gargantext / haskell-gargantext · Commits

Commit b3a89d3a (Verified)
Authored Feb 25, 2025 by Przemyslaw Kaminski
Merge branch 'dev' into 444-dev-temporary-file-storage

Parents: 1c214e1c, 945fd8d0
Pipeline #7370 failed with stages in 57 minutes and 3 seconds
Showing 73 changed files with 1216 additions and 273 deletions (+1216, -273)
gargantext.cabal  +10 -1
src/Gargantext/API/Admin/Auth/Types.hs  +2 -0
src/Gargantext/API/Admin/EnvTypes.hs  +11 -4
src/Gargantext/API/Auth/PolicyCheck.hs  +7 -1
src/Gargantext/API/Dev.hs  +5 -2
src/Gargantext/API/Errors.hs  +2 -0
src/Gargantext/API/Errors/Types.hs  +27 -3
src/Gargantext/API/Errors/Types/Backend.hs  +1 -0
src/Gargantext/API/HashedResponse.hs  +2 -0
src/Gargantext/API/Ngrams/List.hs  +11 -5
src/Gargantext/API/Ngrams/Types.hs  +17 -0
src/Gargantext/API/Node.hs  +16 -9
src/Gargantext/API/Node/Corpus/New.hs  +11 -6
src/Gargantext/API/Node/Corpus/Subcorpus.hs  +9 -7
src/Gargantext/API/Node/Document/Export.hs  +24 -13
src/Gargantext/API/Node/Document/Export/Types.hs  +18 -5
src/Gargantext/API/Node/DocumentUpload.hs  +41 -4
src/Gargantext/API/Node/Share.hs  +15 -19
src/Gargantext/API/Prelude.hs  +6 -2
src/Gargantext/API/Routes/Client.hs  +78 -0
src/Gargantext/API/Routes/Named/Corpus.hs  +2 -2
src/Gargantext/API/Routes/Named/Node.hs  +2 -0
src/Gargantext/API/Routes/Named/Private.hs  +11 -9
src/Gargantext/API/Routes/Named/Remote.hs  +78 -0
src/Gargantext/API/Routes/Named/Share.hs  +6 -4
src/Gargantext/API/Server/Named/Private.hs  +2 -2
src/Gargantext/API/Server/Named/Remote.hs  +343 -0
src/Gargantext/API/ThrowAll.hs  +2 -0
src/Gargantext/Core.hs  +2 -0
src/Gargantext/Core/Config.hs  +5 -0
src/Gargantext/Core/Text/Ngrams.hs  +1 -0
src/Gargantext/Core/Types.hs  +2 -0
src/Gargantext/Core/Types/Individu.hs  +3 -1
src/Gargantext/Core/Types/Main.hs  +44 -1
src/Gargantext/Core/Viz/Graph/Bridgeness.hs  +0 -36
src/Gargantext/Core/Viz/Graph/GEXF.hs  +14 -3
src/Gargantext/Core/Viz/Graph/Tools.hs  +26 -27
src/Gargantext/Core/Worker.hs  +21 -4
src/Gargantext/Core/Worker/Jobs.hs  +2 -0
src/Gargantext/Core/Worker/Jobs/Types.hs  +89 -7
src/Gargantext/Core/Worker/Types.hs  +3 -1
src/Gargantext/Database/Action/Flow.hs  +1 -20
src/Gargantext/Database/Action/Flow/Utils.hs  +5 -14
src/Gargantext/Database/Admin/Types/Hyperdata/Any.hs  +2 -0
src/Gargantext/Database/Admin/Types/Hyperdata/Contact.hs  +9 -5
src/Gargantext/Database/Admin/Types/Hyperdata/Document.hs  +3 -0
src/Gargantext/Database/Admin/Types/Hyperdata/User.hs  +6 -0
src/Gargantext/Database/Admin/Types/Node.hs  +9 -0
src/Gargantext/Database/Query/Facet/Types.hs  +2 -0
src/Gargantext/Database/Query/Table/Node.hs  +27 -11
src/Gargantext/Database/Query/Table/Node/Children.hs  +3 -3
src/Gargantext/Database/Query/Table/Node/Error.hs  +3 -1
src/Gargantext/Database/Schema/Context.hs  +2 -0
src/Gargantext/Database/Schema/Node.hs  +40 -0
src/Gargantext/Orphans.hs  +5 -0
src/Gargantext/Orphans/OpenAPI.hs  +5 -1
src/Gargantext/Utils/UTCTime.hs  +1 -0
stack.yaml  +2 -1
start  +1 -1
test/Test/API/Authentication.hs  +1 -1
test/Test/API/Private.hs  +4 -1
test/Test/API/Private/Move.hs  +1 -1
test/Test/API/Private/Remote.hs  +94 -0
test/Test/API/Private/Share.hs  +1 -1
test/Test/API/Private/Table.hs  +1 -1
test/Test/API/Routes.hs  +3 -21
test/Test/API/Setup.hs  +1 -0
test/Test/API/UpdateList.hs  +1 -1
test/Test/Instances.hs  +3 -0
test/Test/Ngrams/Terms.hs  +5 -9
test/Test/Offline/JSON.hs  +2 -0
test/Test/Server/ReverseProxy.hs  +1 -1
test/Test/Utils.hs  +1 -1
gargantext.cabal

@@ -150,6 +150,7 @@ library
     Gargantext.API.Prelude
     Gargantext.API.Public.Types
     Gargantext.API.Routes
+    Gargantext.API.Routes.Client
     Gargantext.API.Routes.Named
     Gargantext.API.Routes.Named.Annuaire
     Gargantext.API.Routes.Named.Contact
@@ -166,6 +167,7 @@ library
     Gargantext.API.Routes.Named.Private
     Gargantext.API.Routes.Named.Public
     Gargantext.API.Routes.Named.Publish
+    Gargantext.API.Routes.Named.Remote
    Gargantext.API.Routes.Named.Search
     Gargantext.API.Routes.Named.Share
     Gargantext.API.Routes.Named.Table
@@ -345,6 +347,7 @@ library
     Gargantext.API.Server.Named.Ngrams
     Gargantext.API.Server.Named.Private
     Gargantext.API.Server.Named.Public
+    Gargantext.API.Server.Named.Remote
     Gargantext.API.Server.Named.Viz
     Gargantext.API.Swagger
     Gargantext.API.Table
@@ -500,6 +503,7 @@ library
     , cache >= 0.1.3.0
     , case-insensitive ^>= 1.2.1.0
     , cassava ^>= 0.5.2.0
+    , cborg-json >= 0.2
     , cereal ^>= 0.5.8.2
     , clock >= 0.8
     , conduit ^>= 1.3.4.2
@@ -586,11 +590,13 @@ library
     , serialise ^>= 0.2.4.0
     , servant >= 0.20.1 && < 0.21
     , servant-auth ^>= 0.4.0.0
+    , servant-auth-client
     , servant-auth-server ^>=0.4.6.0
     , servant-auth-swagger ^>= 0.2.10.1
     , servant-blaze ^>= 0.9.1
     , servant-client >= 0.20 && < 0.21
     , servant-client-core >= 0.20 && < 0.21
+    , servant-conduit >= 0.15 && < 0.17
     , servant-ekg ^>= 0.3.1
     , servant-routes < 0.2
     , servant-openapi3 >= 2.0.1.6
@@ -604,6 +610,7 @@ library
     , singletons ^>= 3.0.2
     , singletons-th >= 3.1 && < 3.2
     , smtp-mail >= 0.3.0.0
+    , split >= 0.2.0
     , stemmer == 0.5.2
     , stm >= 2.5.1.0 && < 2.6
     , stm-containers >= 1.2.0.3 && < 1.3
@@ -738,7 +745,7 @@ common testDependencies
     , servant-auth-client
     , servant-client >= 0.20 && < 0.21
     , servant-client-core >= 0.20 && < 0.21
-    , servant-websockets >= 2.0.0 && < 2.1
+    , servant-conduit >= 0.15 && < 0.17
     , shelly >= 1.9 && < 2
     , stm >= 2.5.1.0 && < 2.6
     , streaming-commons
@@ -781,6 +788,7 @@ test-suite garg-test-tasty
     CLI.Phylo.Common
     Paths_gargantext
     Test.API.Private.Move
+    Test.API.Private.Remote
     Test.API.Private.Share
     Test.API.Private.Table
     Test.API.Authentication
@@ -851,6 +859,7 @@ test-suite garg-test-hspec
     Test.API.Notifications
     Test.API.Private
     Test.API.Private.Move
+    Test.API.Private.Remote
     Test.API.Private.Share
     Test.API.Private.Table
     Test.API.Routes
src/Gargantext/API/Admin/Auth/Types.hs

@@ -65,6 +65,8 @@ data AuthResponse = AuthResponse { _authRes_token :: Token
   }
   deriving (Generic, Eq, Show)

+instance NFData AuthResponse where
+
 type Token  = Text
 type TreeId = NodeId
src/Gargantext/API/Admin/EnvTypes.hs

@@ -43,14 +43,14 @@ import Database.PostgreSQL.Simple (Connection)
 import Gargantext.API.Admin.Orchestrator.Types (JobLog, noJobLog)
 import Gargantext.API.Errors.Types (BackendInternalError)
 import Gargantext.API.Prelude (GargM, IsGargServer)
+import Gargantext.Core.Config (GargConfig(..), gc_mail_config, gc_nlp_config, HasJWTSettings(..), HasConfig(..), HasManager(..))
+import Gargantext.Core.Mail.Types (HasMail, mailSettings)
+import Gargantext.Core.NLP (HasNLPServer(..), nlpServerMap)
+import Gargantext.Core.NodeStory
 import Gargantext.Core.Notifications.CentralExchange qualified as CE
 import Gargantext.Core.Notifications.CentralExchange.Types qualified as CET
 import Gargantext.Core.Notifications.Dispatcher (Dispatcher)
 import Gargantext.Core.Notifications.Dispatcher.Types (HasDispatcher(..))
-import Gargantext.Core.Config (GargConfig(..), gc_mail_config, gc_nlp_config, HasJWTSettings(..), HasConfig(..))
-import Gargantext.Core.Mail.Types (HasMail, mailSettings)
-import Gargantext.Core.NLP (HasNLPServer(..), nlpServerMap)
-import Gargantext.Core.NodeStory
 import Gargantext.Database.Prelude (HasConnectionPool(..))
 import Gargantext.Prelude hiding (to)
 import Gargantext.System.Logging
@@ -144,6 +144,9 @@ instance CET.HasCentralExchangeNotification Env where
     c <- asks (view env_config)
     liftBase $ CE.notify (_gc_notifications_config c) m

+instance HasManager Env where
+  gargHttpManager = env_manager
+
 data FireWall = FireWall { unFireWall :: Bool }

 data MockEnv = MockEnv
@@ -176,6 +179,7 @@ instance HasLogger (GargM DevEnv BackendInternalError) where
 data DevEnv = DevEnv
   { _dev_env_config    :: !GargConfig
+  , _dev_env_manager   :: ~Manager
   , _dev_env_logger    :: !(Logger (GargM DevEnv BackendInternalError))
   , _dev_env_pool      :: !(Pool Connection)
   , _dev_env_nodeStory :: !NodeStoryEnv
@@ -234,6 +238,9 @@ instance HasNodeArchiveStoryImmediateSaver DevEnv where
 instance HasMail DevEnv where
   mailSettings = dev_env_config . gc_mail_config

+instance HasManager DevEnv where
+  gargHttpManager = dev_env_manager
+
 instance HasNLPServer DevEnv where
   nlpServer = dev_env_config . gc_nlp_config . (to nlpServerMap)
src/Gargantext/API/Auth/PolicyCheck.hs

@@ -33,6 +33,7 @@ module Gargantext.API.Auth.PolicyCheck (
   , nodePublishedEdit
   , moveChecks
   , publishChecks
+  , remoteExportChecks
   , userMe
   , alwaysAllow
   , alwaysDeny
@@ -211,7 +212,7 @@ nodeNotDescendant :: AccessPolicyErrorReason
 nodeNotDescendant = AccessPolicyErrorReason "Node is not a direct descendant."

 invalidUserPermissions :: AccessPolicyErrorReason
-invalidUserPermissions = AccessPolicyErrorReason "User not authorized to perform the operation."
+invalidUserPermissions = AccessPolicyErrorReason "User not authorized to perform the operation (typically due to wrong ownership)."

 -------------------------------------------------------------------------------
 -- Smart constructors of access checks
@@ -274,6 +275,11 @@ publishChecks :: NodeId -> BoolExpr AccessCheck
 publishChecks nodeId =
   (nodeUser nodeId `BOr` nodeSuper nodeId)

+-- | A user can export a node if he/she owns it, or if that's a super.
+remoteExportChecks :: NodeId -> BoolExpr AccessCheck
+remoteExportChecks nodeId =
+  (nodeUser nodeId `BOr` nodeSuper nodeId)
+
 alwaysAllow :: BoolExpr AccessCheck
 alwaysAllow = BConst . Positive $ AC_always_allow
src/Gargantext/API/Dev.hs

@@ -14,19 +14,20 @@ module Gargantext.API.Dev where
 import Control.Lens (view)
 import Control.Monad (fail)
+import Data.Pool (withResource)
 import Database.PostgreSQL.Simple qualified as PGS
-import Data.Pool (withResource)
 import Gargantext.API.Admin.EnvTypes ( DevEnv(..), Mode(Dev) )
 import Gargantext.API.Admin.Settings ( newPool )
 import Gargantext.API.Errors.Types ( BackendInternalError )
 import Gargantext.API.Prelude ( GargM )
 import Gargantext.Core.Config ( _gc_database_config )
 import Gargantext.Core.Config.Types ( SettingsFile(..) )
+import Gargantext.Core.Config.Utils ( readConfig )
 import Gargantext.Core.NodeStory ( fromDBNodeStoryEnv )
 import Gargantext.Database.Prelude ( Cmd, CmdRandom, connPool, runCmd )
 import Gargantext.Prelude
-import Gargantext.Core.Config.Utils ( readConfig )
 import Gargantext.System.Logging ( withLoggerHoisted )
+import Network.HTTP.Client.TLS ( newTlsManager )
 import Servant ( ServerError )

 -------------------------------------------------------------------
@@ -41,8 +42,10 @@ withDevEnv settingsFile k = withLoggerHoisted Dev $ \logger -> do
   --nodeStory_env <- fromDBNodeStoryEnv (_gc_repofilepath cfg)
   pool <- newPool (_gc_database_config cfg)
   nodeStory_env <- fromDBNodeStoryEnv pool
+  manager <- newTlsManager
   pure $ DevEnv
     { _dev_env_pool      = pool
+    , _dev_env_manager   = manager
     , _dev_env_logger    = logger
     , _dev_env_nodeStory = nodeStory_env
     , _dev_env_config    = cfg
src/Gargantext/API/Errors.hs

@@ -176,6 +176,8 @@ nodeErrorToFrontendError ne = case ne of
     -> mkFrontendErrShow $ FE_node_is_read_only nodeId reason
   MoveError sourceId targetId reason
     -> mkFrontendErrShow $ FE_node_move_error sourceId targetId reason
+  NodeNotExportable nodeId reason
+    -> mkFrontendErrShow $ FE_node_export_error nodeId reason
   -- backward-compatibility shims, to remove eventually.
   DoesNotExist nid
src/Gargantext/API/Errors/Types.hs

@@ -35,6 +35,7 @@ module Gargantext.API.Errors.Types (
   , GraphQLError(..)
   , ToFrontendErrorData(..)
   , AccessPolicyErrorReason(..)
+  , HasBackendInternalError(..)
   -- * Constructing frontend errors
   , mkFrontendErrNoDiagnostic
@@ -48,8 +49,8 @@ module Gargantext.API.Errors.Types (
 import Control.Lens ((#), makePrisms, Prism')
 import Control.Monad.Fail (fail)
+import Data.Aeson (Value(..), (.:), (.=), object, withObject)
 import Data.Aeson.Types (typeMismatch, emptyArray)
-import Data.Aeson (Value(..), (.:), (.=), object, withObject)
 import Data.List.NonEmpty qualified as NE
 import Data.Singletons.TH ( SingI(sing), SingKind(fromSing) )
 import Data.Text qualified as T
@@ -67,6 +68,7 @@ import Gargantext.Prelude hiding (Location, WithStacktrace)
 import Gargantext.Utils.Dict (Dict(..))
 import Gargantext.Utils.Jobs.Monad qualified as Jobs
 import Servant (ServerError)
+import Control.Lens.Prism (prism')

 -- | A 'WithStacktrace' carries an error alongside its
 -- 'CallStack', to be able to print the correct source location
@@ -121,6 +123,12 @@ data BackendInternalError
 makePrisms ''BackendInternalError

+class HasBackendInternalError e where
+  _BackendInternalError :: Prism' e BackendInternalError
+
+instance HasBackendInternalError BackendInternalError where
+  _BackendInternalError = prism' identity Just
+
 instance ToJSON BackendInternalError where
   toJSON (InternalJobError s) =
     object [ ("status", toJSON ("IsFailure" :: Text))
@@ -258,8 +266,8 @@ newtype instance ToFrontendErrorData 'EC_400__node_creation_failed_no_parent =
 data instance ToFrontendErrorData 'EC_400__node_creation_failed_insert_node =
   FE_node_creation_failed_insert_node { necin_user_id :: UserId
-                                      , necin_parent_id :: ParentId
+                                      , necin_parent_id :: Maybe ParentId
                                       }
   deriving (Show, Eq, Generic)

 newtype instance ToFrontendErrorData 'EC_500__node_generic_exception =
@@ -278,6 +286,10 @@ data instance ToFrontendErrorData 'EC_403__node_move_error =
   FE_node_move_error { nme_source_id :: !NodeId, nme_target_id :: !NodeId, nme_reason :: !T.Text }
   deriving (Show, Eq, Generic)

+data instance ToFrontendErrorData 'EC_403__node_export_error =
+  FE_node_export_error { nee_node_id :: !NodeId, nee_reason :: !T.Text }
+  deriving (Show, Eq, Generic)
+
 --
 -- validation errors
 --
@@ -514,6 +526,15 @@ instance FromJSON (ToFrontendErrorData 'EC_403__node_move_error) where
     nme_reason    <- o .: "reason"
     pure FE_node_move_error { .. }

+instance ToJSON (ToFrontendErrorData 'EC_403__node_export_error) where
+  toJSON FE_node_export_error{..} =
+    object [ "node_id" .= toJSON nee_node_id, "reason" .= toJSON nee_reason ]
+
+instance FromJSON (ToFrontendErrorData 'EC_403__node_export_error) where
+  parseJSON = withObject "FE_node_move_error" $ \o -> do
+    nee_node_id <- o .: "node_id"
+    nee_reason  <- o .: "reason"
+    pure FE_node_export_error { .. }
+
 --
 -- validation errors
 --
@@ -728,6 +749,9 @@ instance FromJSON FrontendError where
     EC_403__node_move_error -> do
       (fe_data :: ToFrontendErrorData 'EC_403__node_move_error) <- o .: "data"
       pure FrontendError { .. }
+    EC_403__node_export_error -> do
+      (fe_data :: ToFrontendErrorData 'EC_403__node_export_error) <- o .: "data"
+      pure FrontendError { .. }

     -- validation error
     EC_400__validation_error -> do
src/Gargantext/API/Errors/Types/Backend.hs

@@ -35,6 +35,7 @@ data BackendErrorCode
   | EC_400__node_needs_configuration
   | EC_403__node_is_read_only
   | EC_403__node_move_error
+  | EC_403__node_export_error
   -- validation errors
   | EC_400__validation_error
   -- policy check errors
src/Gargantext/API/HashedResponse.hs

@@ -19,6 +19,8 @@ import Gargantext.Prelude.Crypto.Hash qualified as Crypto (hash)
 data HashedResponse a = HashedResponse { hash :: Text, value :: a }
   deriving (Generic)

+instance NFData a => NFData (HashedResponse a) where
+
 instance ToSchema a => ToSchema (HashedResponse a)
 instance ToJSON a => ToJSON (HashedResponse a) where
   toJSON = genericToJSON defaultOptions
src/Gargantext/API/Ngrams/List.hs

@@ -10,6 +10,7 @@ Portability : POSIX
 -}

 {-# LANGUAGE ScopedTypeVariables #-}
+{-# LANGUAGE TemplateHaskell     #-}
 {-# LANGUAGE TypeApplications    #-}
 {-# LANGUAGE ViewPatterns        #-}
@@ -21,19 +22,19 @@ import Data.ByteString.Lazy qualified as BSL
 import Data.Csv qualified as Tsv
 import Data.HashMap.Strict (HashMap)
 import Data.HashMap.Strict qualified as HashMap
+import Data.Map.Strict (toList)
 import Data.Map.Strict qualified as Map
-import Data.Map.Strict (toList)
 import Data.Set qualified as Set
 import Data.Text (concat, pack, splitOn)
 import Data.Text.Encoding qualified as TE
+import Data.Vector (Vector)
 import Data.Vector qualified as Vec
-import Data.Vector (Vector)
 import Database.PostgreSQL.Simple.LargeObjects qualified as PSQL
 import Gargantext.API.Admin.EnvTypes (Env)
 import Gargantext.API.Errors.Types ( BackendInternalError(InternalServerError) )
+import Gargantext.API.Ngrams (setListNgrams)
 import Gargantext.API.Ngrams.List.Types ( _wjf_data, _wtf_data )
 import Gargantext.API.Ngrams.Prelude ( getNgramsList )
-import Gargantext.API.Ngrams (setListNgrams)
 import Gargantext.API.Ngrams.Types
 import Gargantext.API.Prelude (GargM, serverError, HasServerError)
 import Gargantext.API.Routes.Named.List qualified as Named
@@ -50,6 +51,7 @@ import Gargantext.Database.Schema.Ngrams ( text2ngrams, NgramsId )
 import Gargantext.Database.Schema.Node (_node_parent_id)
 import Gargantext.Database.Types (Indexed(..))
 import Gargantext.Prelude hiding (concat, toList)
+import Gargantext.System.Logging ( logLocM, LogLevel(..), MonadLogger )
 import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..))
 import Prelude qualified
 import Protolude qualified as P
@@ -120,7 +122,7 @@ jsonPostAsync = Named.JSONAPI {
   }

 ------------------------------------------------------------------------
-postAsyncJSON :: (HasNodeStory env err m, MonadJobStatus m)
+postAsyncJSON :: (HasNodeStory env err m, MonadJobStatus m, MonadLogger m)
               => ListId
               -> NgramsList
               -> JobHandle m
@@ -129,13 +131,17 @@ postAsyncJSON l ngramsList jobHandle = do
   markStarted 2 jobHandle
+  $(logLocM) DEBUG "[postAsyncJSON] Setting the Ngrams list ..."
   setList
+  $(logLocM) DEBUG "[postAsyncJSON] Done."
   markProgress 1 jobHandle

   corpus_node <- getNode l -- (Proxy :: Proxy HyperdataList)
   let corpus_id = fromMaybe (panicTrace "no parent_id") (_node_parent_id corpus_node)
+  $(logLocM) DEBUG "[postAsyncJSON] Executing re-indexing..."
   _ <- reIndexWith corpus_id l NgramsTerms (Set.fromList [MapTerm, CandidateTerm])
+  $(logLocM) DEBUG "[postAsyncJSON] Re-indexing done."

   markComplete jobHandle
@@ -215,7 +221,7 @@ tsvToNgramsTableMap record = case Vec.toList record of
 -- | This is for debugging the TSV parser in the REPL
-importTsvFile :: forall env err m. (HasNodeStory env err m, HasServerError err, MonadJobStatus m)
+importTsvFile :: forall env err m. (HasNodeStory env err m, HasServerError err, MonadJobStatus m, MonadLogger m)
               => ListId -> P.FilePath -> m ()
 importTsvFile lId fp = do
   contents <- liftBase $ P.readFile fp
src/Gargantext/API/Ngrams/Types.hs

@@ -16,6 +16,7 @@ Portability : POSIX
 {-# LANGUAGE TypeOperators      #-}
 {-# LANGUAGE TypeFamilies       #-}
 {-# OPTIONS -fno-warn-orphans #-} -- some instances are orphaned here
+{-# LANGUAGE StandaloneDeriving #-}

 module Gargantext.API.Ngrams.Types where
@@ -98,6 +99,8 @@ newtype MSet a = MSet (Map a ())
   deriving newtype (Semigroup, Monoid)
   deriving anyclass (ToExpr)

+instance NFData a => NFData (MSet a) where
+
 instance ToJSON a => ToJSON (MSet a) where
   toJSON     (MSet m) = toJSON     (Map.keys m)
   toEncoding (MSet m) = toEncoding (Map.keys m)
@@ -171,6 +174,7 @@ instance FromField NgramsRepoElement where
   fromField = fromJSONField
 instance ToField NgramsRepoElement where
   toField = toJSONField
+instance NFData NgramsRepoElement where

 data NgramsElement =
   NgramsElement { _ne_ngrams :: NgramsTerm
@@ -201,6 +205,7 @@ newNgramsElement mayList ngrams =
 instance ToSchema NgramsElement where
   declareNamedSchema = genericDeclareNamedSchema (unPrefixSwagger "_ne_")
+instance NFData NgramsElement where

 ------------------------------------------------------------------------
 newtype NgramsTable = NgramsTable [NgramsElement]
@@ -209,6 +214,7 @@ newtype NgramsTable = NgramsTable [NgramsElement]
   deriving anyclass (ToExpr)

 -- type NgramsList = NgramsTable
+instance NFData NgramsTable where

 makePrisms ''NgramsTable
@@ -379,6 +385,10 @@ newtype PatchMSet a = PatchMSet (PatchMap a AddRem)
   deriving stock (Eq, Show, Generic)
   deriving newtype (Validity, Semigroup, Monoid, Group, Transformable, Composable)

+deriving anyclass instance (NFData k, NFData v) => NFData (PatchMap k v)
+deriving anyclass instance NFData a => NFData (Replace a)
+instance NFData a => NFData (PatchMSet a) where
+
 unPatchMSet :: PatchMSet a -> PatchMap a AddRem
 unPatchMSet (PatchMSet a) = a
@@ -441,6 +451,8 @@ data NgramsPatch
   }
   deriving (Eq, Show, Generic)

+instance NFData NgramsPatch where
+
 -- The JSON encoding is untagged, this is OK since the field names are disjoints and thus the encoding is unambiguous.
 -- TODO: the empty object should be accepted and treated as mempty.
 deriveJSON (unPrefixUntagged "_") ''NgramsPatch
@@ -532,6 +544,8 @@ newtype NgramsTablePatch = NgramsTablePatch (PatchMap NgramsTerm NgramsPatch)
   deriving stock (Eq, Show, Generic)
   deriving newtype (ToJSON, FromJSON, Semigroup, Monoid, Validity, Transformable)

+instance NFData NgramsTablePatch
+
 mkNgramsTablePatch :: Map NgramsTerm NgramsPatch -> NgramsTablePatch
 mkNgramsTablePatch = NgramsTablePatch . PM.fromMap
@@ -683,6 +697,8 @@ deriveJSON (unPrefix "_v_") ''Versioned
 makeLenses ''Versioned
 instance (Typeable a, ToSchema a) => ToSchema (Versioned a) where
   declareNamedSchema = wellNamedSchema "_v_"
+instance NFData a => NFData (Versioned a) where
+instance Serialise a => Serialise (Versioned a) where
 ------------------------------------------------------------------------
 type Count = Int
@@ -697,6 +713,7 @@ deriveJSON (unPrefix "_vc_") ''VersionedWithCount
 makeLenses ''VersionedWithCount
 instance (Typeable a, ToSchema a) => ToSchema (VersionedWithCount a) where
   declareNamedSchema = wellNamedSchema "_vc_"
+instance NFData a => NFData (VersionedWithCount a) where

 toVersionedWithCount :: Count -> Versioned a -> VersionedWithCount a
 toVersionedWithCount count (Versioned version data_) = VersionedWithCount version count data_
src/Gargantext/API/Node.hs

@@ -27,15 +27,15 @@ Node API
 module Gargantext.API.Node
   where

+import Gargantext.API.Admin.Auth (withNamedAccess, withNamedPolicyT, withPolicy)
 import Gargantext.API.Admin.Auth.Types (PathId(..), AuthenticatedUser(..), auth_node_id, auth_user_id)
-import Gargantext.API.Admin.Auth (withNamedAccess, withNamedPolicyT, withPolicy)
 import Gargantext.API.Admin.EnvTypes (Env)
 import Gargantext.API.Auth.PolicyCheck ( nodeReadChecks, nodeWriteChecks, moveChecks, AccessPolicyManager, publishChecks )
 import Gargantext.API.Errors.Types ( BackendInternalError )
 import Gargantext.API.Metrics
 import Gargantext.API.Ngrams.Types (TabType(..))
+import Gargantext.API.Node.DocumentUpload qualified as DocumentUpload
 import Gargantext.API.Node.DocumentsFromWriteNodes qualified as DFWN
-import Gargantext.API.Node.DocumentUpload qualified as DocumentUpload
 import Gargantext.API.Node.File (fileApi, fileAsyncApi)
 import Gargantext.API.Node.FrameCalcUpload qualified as FrameCalcUpload
 import Gargantext.API.Node.New (postNode, postNodeAsyncAPI)
@@ -48,8 +48,11 @@ import Gargantext.API.Routes.Named.Node qualified as Named
 import Gargantext.API.Routes.Named.Private qualified as Named
 import Gargantext.API.Routes.Named.Publish qualified as Named
 import Gargantext.API.Routes.Named.Share qualified as Named
+import Gargantext.API.Routes.Named.Tree qualified as Named
 import Gargantext.API.Search qualified as Search
 import Gargantext.API.Server.Named.Ngrams (apiNgramsTableCorpus)
+import Gargantext.API.Server.Named.Remote qualified as Named
+import Gargantext.API.Server.Named.Remote qualified as Remote
 import Gargantext.API.Table (tableApi, getPair)
 import Gargantext.Core.Types.Individu (User(..))
 import Gargantext.Core.Viz.Phylo.API (phyloAPI)
@@ -61,17 +64,16 @@ import Gargantext.Database.Admin.Types.Node
 import Gargantext.Database.Prelude (DBCmdExtra, JSONB)
 import Gargantext.Database.Query.Table.Node
 import Gargantext.Database.Query.Table.Node.Children (getChildren)
+import Gargantext.Database.Query.Table.Node.Update (Update(..), update)
+import Gargantext.Database.Query.Table.Node.Update qualified as U (update, Update(..), publish)
+import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
 import Gargantext.Database.Query.Table.NodeContext (nodeContextsCategory, nodeContextsScore)
+import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
 import Gargantext.Database.Query.Table.NodeNode
-import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
-import Gargantext.Database.Query.Table.Node.Update qualified as U (update, Update(..), publish)
-import Gargantext.Database.Query.Table.Node.Update (Update(..), update)
 import Gargantext.Database.Query.Tree (tree, tree_flat, TreeMode(..))
 import Gargantext.Prelude
 import Servant
 import Servant.Server.Generic (AsServerT)
-import Gargantext.API.Routes.Named.Tree qualified as Named
-import Gargantext.Database.Query.Table.Node.Error (HasNodeError)

 -- | Delete Nodes
@@ -215,8 +217,12 @@ corpusNodeAPI authenticatedUser = Named.CorpusAPIEndpoint $ \targetNode ->
 ------------------------------------------------------------------------
 nodeAPI :: AuthenticatedUser
         -> Named.NodeAPIEndpoint (AsServerT (GargM Env BackendInternalError))
-nodeAPI authenticatedUser = Named.NodeAPIEndpoint $ \targetNode ->
-  withNamedAccess authenticatedUser (PathNode targetNode) (concreteAPI targetNode)
+nodeAPI authenticatedUser = Named.NodeAPIEndpoint
+  { nodeEndpointAPI = \targetNode ->
+      withNamedAccess authenticatedUser (PathNode targetNode) (concreteAPI targetNode)
+  , nodeRemoteImportAPI = Named.remoteImportAPI authenticatedUser
+  }
   where
     concreteAPI = genericNodeAPI' (Proxy :: Proxy HyperdataAny) authenticatedUser
@@ -268,6 +274,7 @@ genericNodeAPI' _ authenticatedUser targetNode = Named.NodeAPI
   , fileAsyncAPI      = fileAsyncApi authenticatedUser targetNode
   , dfwnAPI           = DFWN.api authenticatedUser targetNode
   , documentUploadAPI = DocumentUpload.api targetNode
+  , remoteExportAPI   = Remote.remoteExportAPI targetNode authenticatedUser
   }
   where
     userRootId = RootId $ authenticatedUser ^. auth_node_id
src/Gargantext/API/Node/Corpus/New.hs

@@ -34,14 +34,14 @@ import Gargantext.API.Node.Corpus.Searx ( triggerSearxSearch )
 import Gargantext.API.Node.Corpus.Types ( Datafield(Web), datafield2origin )
 import Gargantext.API.Node.Corpus.Update ( addLanguageToCorpus )
 import Gargantext.API.Node.Types
+import Gargantext.Core ( withDefaultLanguage, defaultLanguage )
 import Gargantext.Core.Config ( gc_jobs, hasConfig )
 import Gargantext.Core.Config.Types ( jc_max_docs_parsers )
-import Gargantext.Core.NodeStory ( HasNodeStoryImmediateSaver, HasNodeArchiveStoryImmediateSaver, currentVersion, NgramsStatePatch' )
+import Gargantext.Core.NodeStory ( HasNodeStoryImmediateSaver, HasNodeArchiveStoryImmediateSaver, currentVersion, NgramsStatePatch', HasNodeStoryEnv )
 import Gargantext.Core.Text.Corpus.Parsers qualified as Parser (FileType(..), parseFormatC, _ParseFormatError)
 import Gargantext.Core.Text.Corpus.Query qualified as API
 import Gargantext.Core.Types.Individu (User(..))
 import Gargantext.Core.Utils.Prefix (unPrefix)
-import Gargantext.Core ( withDefaultLanguage, defaultLanguage )
 import Gargantext.Database.Action.Flow ( flowCorpus, getDataText, flowDataText, TermType(..) {-, allDataOrigins-} )
 import Gargantext.Database.Action.Flow.Types ( FlowCmdM )
 import Gargantext.Database.Action.Mail ( sendMail )
@@ -51,7 +51,8 @@ import Gargantext.Database.Admin.Types.Hyperdata.Document ( ToHyperdataDocument(
 import Gargantext.Database.Admin.Types.Hyperdata.File ( HyperdataFile(..) )
 import Gargantext.Database.Admin.Types.Node ( CorpusId, NodeType(..), ParentId )
 import Gargantext.Database.GargDB qualified as GargDB
-import Gargantext.Database.Prelude ( readLargeObject )
+import Gargantext.Database.Prelude ( readLargeObject, IsDBCmd )
+import Gargantext.Database.Query.Table.Node.Error ( HasNodeError )
 import Gargantext.Database.Query.Table.Node ( getNodeWith, getOrMkList )
 import Gargantext.Database.Query.Table.Node.UpdateOpaleye ( updateHyperdata )
 import Gargantext.Database.Query.Tree.Root ( MkCorpusUser(MkCorpusUserNormalCorpusIds) )
@@ -360,11 +361,15 @@ addToCorpusWithFile user cid nwf@(NewWithFile _d (withDefaultLanguage -> l) fNam
 --- UTILITIES

-commitCorpus :: ( FlowCmdM env err m
+commitCorpus :: ( IsDBCmd env err m
+                , HasNodeStoryEnv env
+                , HasNodeError err
                 , HasNodeArchiveStoryImmediateSaver env
                 , HasNodeStoryImmediateSaver env )
-             => ParentId -> User -> m (Versioned NgramsStatePatch')
+             => ParentId
+             -> User
+             -> m (Versioned NgramsStatePatch')
 commitCorpus cid user = do
   userId <- getUserId user
   listId <- getOrMkList cid userId
   v      <- currentVersion listId
src/Gargantext/API/Node/Corpus/Subcorpus.hs

 module Gargantext.API.Node.Corpus.Subcorpus where

+import Data.Validity ( Validation(..), ValidationChain(..) )
 import Gargantext.Prelude
-import Gargantext.API.Errors.Types ( BackendInternalError )
+import Gargantext.API.Errors.Types ( BackendInternalError(..) )
 import Gargantext.API.Routes.Named.Corpus ( MakeSubcorpusAPI(..), SubcorpusParams(..) )
 import Gargantext.Core.NodeStory.Types ( HasNodeStoryEnv )
 import Gargantext.Core.NLP ( HasNLPServer )
@@ -21,12 +21,14 @@ makeSubcorpus :: ( HasNodeStoryEnv env
              => UserId
              -> MakeSubcorpusAPI (AsServerT m)
 makeSubcorpus user = MakeSubcorpusAPI $ \corpusId params -> do
-  case parseQuery (RawQuery $ _subcorpusParams_query params) of
-    Left _ -> return False
+  let queryText = _subcorpusParams_query params
+  case parseQuery (RawQuery queryText) of
+    Left msg -> throwError $ InternalValidationError $ Validation [ Violated $
+      "Failed to parse the query " <> show queryText <> ": " <> msg ]
     Right q -> do
-      _ <- makeSubcorpusFromQuery
-             (UserDBId user)
-             corpusId
-             q
-             (_subcorpusParams_reuseParentList params)
-      return True
+      subcorpusId <- makeSubcorpusFromQuery
+        (UserDBId user)
+        corpusId
+        q
+        (_subcorpusParams_reuseParentList params)
+      return subcorpusId
src/Gargantext/API/Node/Document/Export.hs

@@ -9,6 +9,10 @@ Portability : POSIX
 -}

 module Gargantext.API.Node.Document.Export
+  ( documentExportAPI
+  -- * Internals
+  , get_document_json
+  )
   where

 import Control.Lens (view)
@@ -20,7 +24,7 @@ import Data.Time.Clock.System (getSystemTime, systemSeconds)
 import Data.Time.LocalTime (getCurrentTimeZone, TimeZone(timeZoneMinutes))
 import Data.Version (showVersion)
 import Gargantext.API.Node.Document.Export.Types
-import Gargantext.API.Prelude (GargNoServer, IsGargServer)
+import Gargantext.API.Prelude (IsGargServer)
 import Gargantext.API.Routes.Named.Document qualified as Named
 import Gargantext.Core (toDBid)
 import Gargantext.Database.Admin.Types.Node (DocId, NodeId, NodeType(..))
@@ -46,21 +50,26 @@ documentExportAPI userNodeId dId = Named.DocumentExportAPI $ Named.DocumentExpor
 --------------------------------------------------
 -- | Hashes are ordered by Set
-getDocumentsJSON :: NodeId
+getDocumentsJSON :: IsGargServer env err m => NodeId
                  -- ^ The ID of the target user
                  -> DocId
-                 -> GargNoServer (Headers '[Header "Content-Disposition" T.Text] DocumentExport)
+                 -> m (Headers '[Header "Content-Disposition" T.Text] DocumentExport)
 getDocumentsJSON nodeUserId pId = do
-  uId <- view node_user_id <$> getNodeUser nodeUserId
-  mcId <- getClosestParentIdByType pId NodeCorpus
-  let cId = maybe (panicTrace "[G.A.N.D.Export] Node has no parent") identity mcId
-  docs <- runViewDocuments cId False Nothing Nothing Nothing Nothing Nothing
-  let dexp = DocumentExport { _de_documents = mapFacetDoc uId <$> docs
-                            , _de_garg_version = T.pack $ showVersion PG.version }
+  dexp <- get_document_json nodeUserId pId
   pure $ addHeader (T.concat [ "attachment; filename="
                              , "GarganText_DocsList-"
                              , T.pack (show pId)
                              , ".json" ]) dexp
+
+get_document_json :: IsGargServer err env m => NodeId -> DocId -> m DocumentExport
+get_document_json nodeUserId pId = do
+  uId <- view node_user_id <$> getNodeUser nodeUserId
+  mcId <- getClosestParentIdByType pId NodeCorpus
+  let cId = maybe (panicTrace "[G.A.N.D.Export] Node has no parent") identity mcId
+  docs <- runViewDocuments cId False Nothing Nothing Nothing Nothing Nothing
+  pure DocumentExport { _de_documents = mapFacetDoc uId <$> docs
+                      , _de_garg_version = T.pack $ showVersion PG.version }
   where
     mapFacetDoc uId (FacetDoc { .. }) =
       Document { _d_document =
@@ -80,10 +89,11 @@ getDocumentsJSON nodeUserId pId = do
            , _ng_hash = "" }
   , _d_hash = "" }

-getDocumentsJSONZip :: NodeId
+getDocumentsJSONZip :: IsGargServer env err m => NodeId
                     -- ^ The Node ID of the target user
                     -> DocId
-                    -> GargNoServer (Headers '[Header "Content-Disposition" T.Text] DocumentExportZIP) -- [Document]
+                    -> m (Headers '[Header "Content-Disposition" T.Text] DocumentExportZIP) -- [Document]
 getDocumentsJSONZip userNodeId pId = do
   dJSON <- getDocumentsJSON userNodeId pId
   systime <- liftBase getSystemTime
@@ -98,10 +108,11 @@ getDocumentsJSONZip userNodeId pId = do
                              , dezFileName dexpz
                              , ".zip" ]) dexpz

-getDocumentsTSV :: NodeId
+getDocumentsTSV :: IsGargServer err env m => NodeId
                 -- ^ The Node ID of the target user
                 -> DocId
-                -> GargNoServer (Headers '[Header "Content-Disposition" T.Text] T.Text) -- [Document]
+                -> m (Headers '[Header "Content-Disposition" T.Text] T.Text) -- [Document]
 getDocumentsTSV userNodeId pId = do
   dJSON <- getDocumentsJSON userNodeId pId
   let DocumentExport { _de_documents } = getResponse dJSON
src/Gargantext/API/Node/Document/Export/Types.hs

@@ -13,12 +13,13 @@ Portability : POSIX
 module Gargantext.API.Node.Document.Export.Types where

+import Codec.Serialise.Class hiding (encode)
 import Data.Aeson (encode)
 import Data.Aeson.TH (deriveJSON)
 import Data.Csv (DefaultOrdered(..), ToNamedRecord(..), (.=), header, namedRecord)
 import Data.Swagger (genericDeclareNamedSchema, ToParamSchema(..), ToSchema(..))
+import Data.Text qualified as T
 import Data.Text.Encoding qualified as TE
-import Data.Text qualified as T
 import Gargantext.Core.Types (Node, TODO)
 import Gargantext.Core.Utils.Prefix (unPrefix, unPrefixSwagger)
 import Gargantext.Database.Admin.Types.Hyperdata.Document (HyperdataDocument(..))
@@ -28,27 +29,37 @@ import Gargantext.Utils.Servant (ZIP)
 import Gargantext.Utils.Zip (zipContentsPureWithLastModified)
 import Protolude
 import Servant (MimeRender(..), MimeUnrender(..))
+import Prelude (show)

 -- | Document Export
 data DocumentExport =
   DocumentExport { _de_documents :: [Document]
                  , _de_garg_version :: Text
-                 } deriving (Generic)
+                 } deriving (Generic, Show, Eq)
+
+instance Serialise DocumentExport where

 -- | This is to represent a zipped document export. We want to have doc_id in zipped file name.
 data DocumentExportZIP =
   DocumentExportZIP { _dez_dexp :: DocumentExport
                     , _dez_doc_id :: DocId
                     , _dez_last_modified :: Integer } deriving (Generic)

 data Document =
   Document { _d_document :: Node HyperdataDocument
            , _d_ngrams   :: Ngrams
            , _d_hash     :: Hash
            } deriving (Generic)
+
+instance Eq Document where
+  (Document _ _ h1) == (Document _ _ h2) = h1 == h2  -- compare by their hashes
+
+instance Show Document where
+  show (Document _ _ h1) = "Document " <> Prelude.show h1
+
+instance Serialise Document where

 --instance Read Document where
 --  read "" = panic "not implemented"
 instance DefaultOrdered Document where
@@ -102,6 +113,8 @@ instance ToParamSchema Document where
 instance ToParamSchema Ngrams where
   toParamSchema _ = toParamSchema (Proxy :: Proxy TODO)

+instance Serialise Ngrams where
+
 $(deriveJSON (unPrefix "_ng_") ''Ngrams)
 $(deriveJSON (unPrefix "_d_") ''Document)
 $(deriveJSON (unPrefix "_de_") ''DocumentExport)
@@ -113,7 +126,7 @@ $(deriveJSON (unPrefix "_de_") ''DocumentExport)
 -- Needs to be here because of deriveJSON TH above
 dezFileName :: DocumentExportZIP -> Text
-dezFileName (DocumentExportZIP { .. }) = "GarganText_DocsList-" <> show _dez_doc_id <> ".json"
+dezFileName (DocumentExportZIP { .. }) = "GarganText_DocsList-" <> Protolude.show _dez_doc_id <> ".json"

 instance MimeRender ZIP DocumentExportZIP where
   mimeRender _ dexpz@(DocumentExportZIP { .. }) =
src/Gargantext/API/Node/DocumentUpload.hs

@@ -11,33 +11,43 @@ Portability : POSIX

 {-# LANGUAGE AllowAmbiguousTypes #-}
 {-# LANGUAGE MonoLocalBinds #-}
+{-# LANGUAGE TemplateHaskell #-}

 module Gargantext.API.Node.DocumentUpload where

 import Control.Lens (view)
 import Data.Text qualified as T
+import Gargantext.API.Admin.Auth.Types (AuthenticatedUser(..))
 import Gargantext.API.Admin.EnvTypes (Env)
-import Gargantext.API.Errors.Types (BackendInternalError)
+import Gargantext.API.Errors.Types (BackendInternalError(..))
+import Gargantext.API.Node.Corpus.New (commitCorpus)
+import Gargantext.API.Node.Document.Export.Types (Document(..))
 import Gargantext.API.Node.DocumentUpload.Types
 import Gargantext.API.Prelude (GargM)
 import Gargantext.API.Routes.Named.Document qualified as Named
 import Gargantext.API.Worker (serveWorkerAPI)
 import Gargantext.Core (Lang(..))
-import Gargantext.Core.NLP (nlpServerGet)
+import Gargantext.Core.NLP (nlpServerGet, HasNLPServer)
+import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv, HasNodeArchiveStoryImmediateSaver)
 import Gargantext.Core.Text.Corpus.Parsers.Date (mDateSplit)
 import Gargantext.Core.Text.Terms (TermType(..))
+import Gargantext.Core.Types.Individu (User(..))
 import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
+import Gargantext.Core.Worker.Jobs.Types (WorkSplit(..))
 import Gargantext.Database.Action.Flow (addDocumentsToHyperCorpus)
 import Gargantext.Database.Action.Flow.Types (FlowCmdM)
 import Gargantext.Database.Admin.Types.Hyperdata.Corpus (HyperdataCorpus)
 import Gargantext.Database.Admin.Types.Hyperdata.Document (HyperdataDocument(..))
-import Gargantext.Database.Admin.Types.Node (DocId, NodeId, NodeType(NodeCorpus))
+import Gargantext.Database.Admin.Types.Node (DocId, NodeId, NodeType(NodeCorpus), ParentId)
+import Gargantext.Database.Prelude (IsDBCmd)
+import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
 import Gargantext.Database.Query.Table.Node (getClosestParentIdByType')
+import Gargantext.Database.Schema.Node (_node_hyperdata)
 import Gargantext.Prelude
+import Gargantext.System.Logging (logLocM, LogLevel(..), MonadLogger)
 import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..))
 import Servant.Server.Generic (AsServerT)

 api :: NodeId -> Named.DocumentUploadAPI (AsServerT (GargM Env BackendInternalError))
 api nId = Named.DocumentUploadAPI {
   uploadDocAsyncEp = serveWorkerAPI $ \p ->
...
@@ -91,3 +101,30 @@ documentUpload nId doc = do
   let lang = EN
   ncs <- view $ nlpServerGet lang
   addDocumentsToHyperCorpus ncs (Nothing :: Maybe HyperdataCorpus) (Multi lang) cId [hd]
+
+-- | Imports the documents contained into this 'DocumentExport' into this (local) version
+-- of the running node.
+-- /NOTE(adn)/: We should compare the gargantext version and ensure that we are importing
+-- only compatible versions.
+remoteImportDocuments :: ( HasNodeError err
+                         , HasNLPServer env
+                         , HasNodeArchiveStoryImmediateSaver env
+                         , HasNodeStoryEnv env
+                         , IsDBCmd env err m
+                         , MonadLogger m
+                         , MonadIO m )
+                      => AuthenticatedUser
+                      -> ParentId
+                      -> NodeId
+                      -> WorkSplit
+                      -> [Document] -- ^ Total docs
+                      -> m [NodeId]
+remoteImportDocuments loggedInUser corpusId nodeId WorkSplit{..} documents = do
+  let la = Multi EN
+  nlpServerConfig <- view $ nlpServerGet (_tt_lang la)
+  $(logLocM) INFO $ "Importing " <> T.pack (show _ws_current) <> "/" <> T.pack (show _ws_total) <> " documents for corpus node " <> T.pack (show nodeId)
+  docs <- addDocumentsToHyperCorpus nlpServerConfig (Nothing :: Maybe HyperdataCorpus) la corpusId (map (_node_hyperdata . _d_document) documents)
+  _versioned <- commitCorpus corpusId (RootId $ _auth_node_id loggedInUser)
+  $(logLocM) INFO $ "Done importing " <> T.pack (show _ws_current) <> "/" <> T.pack (show _ws_total) <> " documents for corpus node " <> T.pack (show nodeId)
+  pure docs
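
The 'WorkSplit' argument above only labels a chunk for progress logging; the caller is expected to slice the full document list itself. A minimal sketch of such a slicing helper, assuming the 'WorkSplit' type added further down in this diff and 'chunksOf' from the split package; the helper name 'mkWorkSplits' is invented here for illustration only:

import Data.List.Split (chunksOf)
import Gargantext.Core.Worker.Jobs.Types (WorkSplit(..))

-- | Cut a document list into chunks of at most @chunkSize@ elements and pair each
-- chunk with a 'WorkSplit' holding the running total, so the log lines above can
-- report e.g. "Importing 100/250 documents".
mkWorkSplits :: Int -> [doc] -> [(WorkSplit, [doc])]
mkWorkSplits chunkSize docs = zipWith label runningTotals chunks
  where
    chunks        = chunksOf chunkSize docs
    total         = length docs
    runningTotals = scanl1 (+) (map length chunks)
    label done chunk = (WorkSplit { _ws_current = done, _ws_total = total }, chunk)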
src/Gargantext/API/Node/Share.hs

@@ -55,25 +55,21 @@ api userInviting nId (ShareTeamParams user') = do
       pure u
     Left _err -> do
       username' <- getUsername userInviting
-      if username' `List.elem` arbitraryUsername
-        then do
-          -- printDebug "[G.A.N.Share.api]" ("Demo users are not allowed to invite" :: Text)
-          pure ()
-        else do
-          -- TODO better analysis of the composition of what is shared
-          children <- findNodesWithType nId [NodeList] [NodeFolderShared
-                                                       , NodeTeam
-                                                       , NodeFolder
-                                                       , NodeCorpus
-                                                       ]
-          _ <- if List.null children
-            then do
-              -- printDebug "[G.A.N.Share.api]" ("Invitation is enabled if you share a corpus at least" :: Text)
-              pure $ UnsafeMkUserId 0
-            else do
-              -- printDebug "[G.A.N.Share.api]" ("Your invitation is sent to: " <> user'')
-              newUser user''
-          pure ()
+      unless (username' `List.elem` arbitraryUsername) $ do
+        -- TODO better analysis of the composition of what is shared
+        children <- findNodesWithType nId [NodeList] [NodeFolderShared
+                                                     , NodeTeam
+                                                     , NodeFolder
+                                                     , NodeCorpus
+                                                     ]
+        _ <- if List.null children
+          then do
+            -- printDebug "[G.A.N.Share.api]" ("Invitation is enabled if you share a corpus at least" :: Text)
+            pure $ UnsafeMkUserId 0
+          else do
+            -- printDebug "[G.A.N.Share.api]" ("Your invitation is sent to: " <> user'')
+            newUser user''
+        pure ()
       pure u
   fromIntegral <$> DB.shareNodeWith (ShareNodeWith_User NodeFolderShared (UserName user)) nId
...
src/Gargantext/API/Prelude.hs

@@ -17,13 +17,14 @@ module Gargantext.API.Prelude
   , HasServerError(..)
   , serverError
   ) where

+import Control.Exception.Safe qualified as Safe
 import Control.Lens ((#))
 import Control.Monad.Random (MonadRandom)
 import Gargantext.API.Admin.Auth.Types (AuthenticationError)
 import Gargantext.API.Errors.Class (HasAuthenticationError, _AuthenticationError)
-import Gargantext.API.Errors.Types (HasServerError(..), serverError)
+import Gargantext.API.Errors.Types (HasServerError(..), serverError, HasBackendInternalError)
 import Gargantext.Core.Notifications.CentralExchange.Types (HasCentralExchangeNotification)
-import Gargantext.Core.Config (HasConfig)
+import Gargantext.Core.Config (HasConfig, HasManager)
 import Gargantext.Core.Mail.Types (HasMail)
 import Gargantext.Core.NLP (HasNLPServer)
 import Gargantext.Core.NodeStory (HasNodeStory, HasNodeStoryEnv)
...
@@ -45,6 +46,7 @@ type EnvC env =
   , HasNodeStoryEnv env
   , HasMail env
   , HasNLPServer env
+  , HasManager env
   , HasCentralExchangeNotification env
   )
...
@@ -53,6 +55,7 @@ type ErrC err =
   , HasValidationError err
   , HasTreeError err
   , HasServerError err
+  , HasBackendInternalError err
   , HasAuthenticationError err
--  , ToJSON err -- TODO this is arguable
   , Exception err
...
@@ -62,6 +65,7 @@ type GargServerC env err m =
   ( HasNodeStory env err m
   , HasMail env
   , MonadRandom m
+  , Safe.MonadCatch m
   , EnvC env
   , ErrC err
   , ToJSON err
...
src/Gargantext/API/Routes/Client.hs (new file)

{-# LANGUAGE TypeFamilies #-}
{-# LANGUAGE InstanceSigs #-}
{-# OPTIONS_GHC -fno-warn-orphans #-}
{-# LANGUAGE ViewPatterns #-}

module Gargantext.API.Routes.Client where

import Conduit qualified as C
import Data.Proxy
import Data.Text.Encoding qualified as TE
import Gargantext.API.Admin.Auth.Types qualified as Auth
import Gargantext.API.Errors (GargErrorScheme(..))
import Gargantext.API.Routes.Named
import Gargantext.API.Routes.Named.Node
import Gargantext.API.Routes.Named.Private
import Gargantext.API.Routes.Named.Remote qualified as Named
import Gargantext.Database.Admin.Types.Node
import Gargantext.Prelude
import Network.HTTP.Types qualified as H
import Servant.API.WebSocket qualified as WS
import Servant.Auth.Client qualified as S
import Servant.Client.Core
import Servant.Client.Generic (genericClient)
import Servant.Client.Streaming
import Servant.Conduit ()

instance RunClient m => HasClient m WS.WebSocketPending where
  type Client m WS.WebSocketPending = H.Method -> m ()

  clientWithRoute :: Proxy m -> Proxy WS.WebSocketPending -> Request -> Client m WS.WebSocketPending
  clientWithRoute _pm Proxy _req _httpMethod = do
    panicTrace "[WebSocket client] this is not implemented!"

  hoistClientMonad _ _ f cl = \meth -> f (cl meth)

-- | The client for the full API. It also serves as a \"proof\" that our
-- whole API has all the required instances to be used in a client.
clientRoutes :: API (AsClientT ClientM)
clientRoutes = genericClient

remoteImportClient :: Auth.Token
                   -> C.ConduitT () Named.RemoteBinaryData IO ()
                   -> ClientM [NodeId]
remoteImportClient (S.Token . TE.encodeUtf8 -> token) c =
  clientRoutes & apiWithCustomErrorScheme
               & ($ GES_new)
               & backendAPI
               & backendAPI'
               & mkBackEndAPI
               & gargAPIVersion
               & gargPrivateAPI
               & mkPrivateAPI
               & ($ token)
               & nodeEp
               & nodeRemoteImportAPI
               & Named.remoteImportEp
               & ($ c)

remoteExportClient :: Auth.Token
                   -> NodeId
                   -> Named.RemoteExportRequest
                   -> ClientM [NodeId]
remoteExportClient (S.Token . TE.encodeUtf8 -> token) nodeId r =
  clientRoutes & apiWithCustomErrorScheme
               & ($ GES_new)
               & backendAPI
               & backendAPI'
               & mkBackEndAPI
               & gargAPIVersion
               & gargPrivateAPI
               & mkPrivateAPI
               & ($ token)
               & nodeEp
               & nodeEndpointAPI
               & ($ nodeId)
               & remoteExportAPI
               & Named.remoteExportEp
               & ($ r)
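
For context, a hedged sketch of how these clients could be driven with servant-client's streaming runner. The host names, node id and tokens below are made-up examples (not values from this commit), and it assumes the 'Token' type is a Text synonym as in Gargantext.API.Admin.Auth.Types:

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Gargantext.API.Routes.Client (remoteExportClient)
import Gargantext.API.Routes.Named.Remote (RemoteExportRequest(..))
import Gargantext.Database.Admin.Types.Node (NodeId(UnsafeMkNodeId))
import Network.HTTP.Client.TLS (newTlsManager)
import Servant.Client.Streaming (BaseUrl(..), Scheme(Https), mkClientEnv, withClientM)

main :: IO ()
main = do
  mgr <- newTlsManager
  -- The source instance we talk to directly (hypothetical host):
  let sourceUrl = BaseUrl Https "source.gargantext.example" 443 ""
      -- The request tells the source where to push the node and with which credentials:
      request   = RemoteExportRequest
                    { _rer_instance_url  = BaseUrl Https "target.gargantext.example" 443 ""
                    , _rer_instance_auth = "target-jwt-token"
                    }
      env       = mkClientEnv mgr sourceUrl
  withClientM (remoteExportClient "source-jwt-token" (UnsafeMkNodeId 42) request) env $ \res ->
    case res of
      Left err      -> print err
      Right nodeIds -> print nodeIds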
src/Gargantext/API/Routes/Named/Corpus.hs

@@ -30,7 +30,7 @@ import Gargantext.API.Node.Types (NewWithForm, WithQuery)
 import Gargantext.API.Worker (WorkerAPI)
 import Gargantext.Core.Text.Ngrams (NgramsType(..))
 import Gargantext.Core.Utils.Prefix (unPrefix, unPrefixSwagger)
-import Gargantext.Database.Admin.Types.Node (CorpusId, ListId)
+import Gargantext.Database.Admin.Types.Node (CorpusId, ListId, NodeId)
 import Gargantext.Prelude (Bool)
 import Servant
...
@@ -68,7 +68,7 @@ newtype MakeSubcorpusAPI mode = MakeSubcorpusAPI
     :> Capture "corpus_id" CorpusId
     :> "subcorpus"
     :> ReqBody '[JSON] SubcorpusParams
-    :> Post '[JSON] Bool -- was request successful
+    :> Post '[JSON] NodeId -- new subcorpus ID
   } deriving Generic

 data SubcorpusParams = SubcorpusParams
...
src/Gargantext/API/Routes/Named/Node.hs

@@ -60,6 +60,7 @@ import Gargantext.Database.Admin.Types.Hyperdata.User ( HyperdataUser )
 import Gargantext.Database.Query.Facet.Types (FacetDoc, OrderBy(..))
 import Prelude
 import Servant
+import Gargantext.API.Routes.Named.Remote (RemoteExportAPI)

 -------------------------------------------------------------------
 -- | Node API Types management
...
@@ -109,6 +110,7 @@ data NodeAPI a mode = NodeAPI
   , fileAsyncAPI      :: mode :- "async" :> NamedRoutes FileAsyncAPI
   , dfwnAPI           :: mode :- "documents-from-write-nodes" :> NamedRoutes DocumentsFromWriteNodesAPI
   , documentUploadAPI :: mode :- NamedRoutes DocumentUploadAPI
+  , remoteExportAPI   :: mode :- NamedRoutes RemoteExportAPI
   } deriving Generic
...
src/Gargantext/API/Routes/Named/Private.hs

@@ -25,7 +25,6 @@ module Gargantext.API.Routes.Named.Private (
 import Data.Text (Text)
-import GHC.Generics (Generic)
 import Gargantext.API.Admin.Auth.Types (AuthenticatedUser)
 import Gargantext.API.Auth.PolicyCheck (PolicyChecked)
 import Gargantext.API.Routes.Named.Contact (ContactAPI)
...
@@ -34,14 +33,16 @@ import Gargantext.API.Routes.Named.Corpus (AddWithTempFile, AddWithQuery, Corpus...
 import Gargantext.API.Routes.Named.Count (CountAPI, Query)
 import Gargantext.API.Routes.Named.Document (DocumentExportAPI)
 import Gargantext.API.Routes.Named.List (GETAPI, JSONAPI, TSVAPI)
-import Gargantext.API.Routes.Named.Node (NodeAPI, NodesAPI, NodeNodeAPI, Roots)
-import Gargantext.API.Routes.Named.Share (ShareURL)
-import Gargantext.API.Routes.Named.Table (TableNgramsAPI)
-import Gargantext.API.Routes.Named.Tree (NodeTreeAPI, TreeFlatAPI)
-import Gargantext.API.Routes.Named.Viz (GraphAPI, PhyloExportAPI)
-import Gargantext.Database.Admin.Types.Hyperdata.Any (HyperdataAny)
-import Gargantext.Database.Admin.Types.Hyperdata.Corpus (HyperdataAnnuaire, HyperdataCorpus)
-import Gargantext.Database.Admin.Types.Node (ContextId, CorpusId, DocId, NodeId)
+import Gargantext.API.Routes.Named.Node
+import Gargantext.API.Routes.Named.Remote
+import Gargantext.API.Routes.Named.Share
+import Gargantext.API.Routes.Named.Table
+import Gargantext.API.Routes.Named.Tree
+import Gargantext.API.Routes.Named.Viz
+import Gargantext.Database.Admin.Types.Hyperdata.Any
+import Gargantext.Database.Admin.Types.Hyperdata.Corpus
+import Gargantext.Database.Admin.Types.Node
+import GHC.Generics
 import Servant.API
 import Servant.Auth qualified as SA
...
@@ -119,6 +120,7 @@ data NodeAPIEndpoint mode = NodeAPIEndpoint
       :> Summary "Node endpoint"
       :> Capture "node_id" NodeId
       :> NamedRoutes (NodeAPI HyperdataAny)
+  , nodeRemoteImportAPI :: mode :- "node" :> "remote" :> NamedRoutes RemoteImportAPI
   } deriving Generic

 newtype AnnuaireAPIEndpoint mode = AnnuaireAPIEndpoint
...
src/Gargantext/API/Routes/Named/Remote.hs (new file)

{-# LANGUAGE TypeOperators #-}
{-# LANGUAGE DerivingStrategies #-}

module Gargantext.API.Routes.Named.Remote (
  -- * Routes types
    RemoteExportAPI(..)
  , RemoteImportAPI(..)
  , RemoteExportRequest(..)
  , RemoteBinaryData(..)
  ) where

import Conduit qualified as C
import Data.Aeson as JSON
import Data.ByteString.Lazy qualified as BL
import Data.ByteString qualified as BS
import Data.Proxy
import Data.Swagger hiding (Http)
import Gargantext.API.Admin.Auth.Types (Token)
import Gargantext.API.Auth.PolicyCheck (PolicyChecked)
import Gargantext.Database.Admin.Types.Node (NodeId(..))
import GHC.Generics
import Prelude
import Servant.API
import Servant.Client.Core.BaseUrl
import Test.QuickCheck

data RemoteExportAPI mode = RemoteExportAPI
  { remoteExportEp :: mode :- "remote" :> ReqBody '[JSON] RemoteExportRequest :> PolicyChecked (Post '[JSON] [NodeId])
  } deriving Generic

data RemoteImportAPI mode = RemoteImportAPI
  { remoteImportEp :: mode :- "import" :> StreamBody NoFraming OctetStream (C.ConduitT () RemoteBinaryData IO ()) :> Post '[JSON] [NodeId]
  } deriving Generic

data RemoteExportRequest = RemoteExportRequest
  { -- | The URL of the instance we want to copy data to.
    _rer_instance_url  :: BaseUrl
    -- | The JWT token to use for authentication purposes.
  , _rer_instance_auth :: Token
  } deriving (Show, Eq, Generic)

instance Arbitrary RemoteExportRequest where
  arbitrary = RemoteExportRequest <$> (pure (BaseUrl Http "dev.sub.gargantext.org" 8008 "")) <*> arbitrary

instance ToJSON RemoteExportRequest where
  toJSON RemoteExportRequest{..} =
    JSON.object [ "instance_url"  .= toJSON _rer_instance_url
                , "instance_auth" .= toJSON _rer_instance_auth
                ]

instance FromJSON RemoteExportRequest where
  parseJSON = withObject "RemoteExportRequest" $ \o -> do
    _rer_instance_url  <- maybe (fail "RemoteExportRequest invalid URL") pure =<< (parseBaseUrl <$> o .: "instance_url")
    _rer_instance_auth <- o .: "instance_auth"
    pure RemoteExportRequest{..}

instance ToSchema RemoteExportRequest where
  declareNamedSchema _ =
    let exampleSchema = RemoteExportRequest (BaseUrl Http "dev.sub.gargantext.org" 8008 "") ("abcdef")
    in pure $ NamedSchema (Just "RemoteExportRequest") $ sketchStrictSchema exampleSchema

newtype RemoteBinaryData = RemoteBinaryData { getRemoteBinaryData :: BS.ByteString }
  deriving (Show, Eq, Ord)

instance Accept RemoteBinaryData where
  contentType _ = contentType (Proxy :: Proxy OctetStream)

instance MimeRender OctetStream RemoteBinaryData where
  mimeRender _ (RemoteBinaryData bs) = BL.fromStrict bs

instance MimeUnrender OctetStream RemoteBinaryData where
  mimeUnrender _ bs = Right (RemoteBinaryData $ BS.toStrict bs)

instance ToSchema RemoteBinaryData where
  declareNamedSchema _ = pure $ NamedSchema (Just "RemoteExportRequest") binarySchema
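
A hedged sketch of the wire format implied by the instances above. It assumes servant-client-core's stock JSON instances for 'BaseUrl' and that 'Token' is a Text synonym; the URL and token are the same example values used in the Arbitrary instance:

{-# LANGUAGE OverloadedStrings #-}
module Main where

import Data.Aeson (decode, encode)
import qualified Data.ByteString.Lazy.Char8 as BL8
import Gargantext.API.Routes.Named.Remote (RemoteExportRequest(..))
import Servant.Client.Core.BaseUrl (BaseUrl(..), Scheme(Http))

main :: IO ()
main = do
  let req = RemoteExportRequest (BaseUrl Http "dev.sub.gargantext.org" 8008 "") "abcdef"
  -- Prints something like {"instance_auth":"abcdef","instance_url":"http://dev.sub.gargantext.org:8008"}
  BL8.putStrLn (encode req)
  -- The hand-written FromJSON instance accepts the same shape back:
  print (decode (encode req) :: Maybe RemoteExportRequest)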
src/Gargantext/API/Routes/Named/Share.hs

@@ -13,14 +13,14 @@ module Gargantext.API.Routes.Named.Share (
   , ShareNodeParams(..)
   ) where

-import Data.Aeson (FromJSON(..), ToJSON(..), withText)
+import Data.Aeson (withText)
 import Data.Swagger (ToSchema, declareNamedSchema)
 import Data.Text qualified as T
-import GHC.Generics (Generic)
 import Gargantext.API.Node.Share.Types (ShareNodeParams(..))
-import Gargantext.Database.Admin.Types.Node (NodeId, NodeType)
+import Gargantext.Database.Admin.Types.Node
+import Gargantext.Prelude
 import Network.URI (parseURI)
-import Prelude
+import Prelude (fail)
 import Servant

 -- | A shareable link.
...
@@ -31,6 +31,8 @@ import Servant
 newtype ShareLink = ShareLink { getShareLink :: URI }
   deriving (Show, Eq, Ord, Generic)

+instance NFData ShareLink where
+
 renderShareLink :: ShareLink -> T.Text
 renderShareLink = T.pack . show . getShareLink
...
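
A small hedged sketch of the ShareLink helpers shown unchanged above; the URL is an arbitrary example and the 'ShareLink' constructor is assumed to be exported by this module:

import qualified Data.Text as T
import Gargantext.API.Routes.Named.Share (ShareLink(..), renderShareLink)
import Network.URI (parseURI)

demoShareLink :: Maybe T.Text
demoShareLink =
  -- renderShareLink is just T.pack . show . getShareLink, so this yields the URI back as Text.
  renderShareLink . ShareLink <$> parseURI "https://dev.sub.gargantext.org/#/share/NodeCorpus/42"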
src/Gargantext/API/Server/Named/Private.hs

@@ -10,17 +10,17 @@ import Gargantext.API.Errors.Types (BackendInternalError)
 import Gargantext.API.Members (members)
 import Gargantext.API.Ngrams.List qualified as List
 import Gargantext.API.Node (annuaireNodeAPI, corpusNodeAPI, nodeAPI, nodeNodeAPI, nodesAPI, roots)
+import Gargantext.API.Node qualified as Tree
 import Gargantext.API.Node.Contact as Contact
 import Gargantext.API.Node.Corpus.Export qualified as CorpusExport
 import Gargantext.API.Node.Corpus.Subcorpus qualified as Subcorpus
 import Gargantext.API.Node.Document.Export (documentExportAPI)
 import Gargantext.API.Node.Phylo.Export qualified as PhyloExport
-import Gargantext.API.Node qualified as Tree
 import Gargantext.API.Node.ShareURL (shareURL)
 import Gargantext.API.Prelude (GargM)
 import Gargantext.API.Routes (addWithTempFileApi, addCorpusWithQuery)
 import Gargantext.API.Routes.Named.Private qualified as Named
-import Gargantext.API.Server.Named.Ngrams (apiNgramsTableDoc)
+import Gargantext.API.Server.Named.Ngrams
 import Gargantext.API.Server.Named.Viz qualified as Viz
 import Gargantext.Core.Types.Individu (User(..))
 import Gargantext.Database.Admin.Types.Hyperdata (HyperdataAny)
...
src/Gargantext/API/Server/Named/Remote.hs (new file; diff collapsed in the original view)
src/Gargantext/API/ThrowAll.hs

@@ -30,6 +30,7 @@ import Gargantext.API.Admin.EnvTypes (Env)
 import Gargantext.API.Errors.Types (BackendInternalError)
 import Gargantext.API.Prelude (GargM, _ServerError)
 import Gargantext.API.Routes.Named.Private qualified as Named
+import Gargantext.API.Routes.Named.Remote () -- instance MimeUnrenderer
 import Gargantext.API.Server.Named.Private qualified as Named
 import Gargantext.Database.Admin.Types.Node (UserId(..))
 import Gargantext.Prelude hiding (Handler)
...
@@ -37,6 +38,7 @@ import Network.HTTP.Types.Status (Status(..))
 import Network.Wai (responseLBS)
 import Servant
 import Servant.Auth.Server (AuthResult(..))
+import Servant.Conduit ()
 import Servant.Server.Generic (AsServerT)

 -- | Slightly more general version of the 'ThrowAll' typeclass from Servant,
...
src/Gargantext/Core.hs

@@ -57,6 +57,8 @@ data Lang = DE
           | ZH
   deriving (Read, Show, Eq, Ord, Enum, Bounded, Generic, GQLType)

+instance NFData Lang where
+
 -- | Defaults to 'EN' in all those places where a language is mandatory,
 -- but an optional one has been passed.
 withDefaultLanguage :: Maybe Lang -> Lang
...
src/Gargantext/Core/Config.hs

@@ -35,6 +35,7 @@ module Gargantext.Core.Config (
   , HasJWTSettings(..)
   , HasConfig(..)
+  , HasManager(..)
   ) where

 import Control.Lens (Getter)
...
@@ -46,6 +47,7 @@ import Gargantext.Core.Config.NLP (NLPConfig)
 import Gargantext.Core.Config.Worker (WorkerSettings)
 import Gargantext.Core.Config.Types
 import Gargantext.Prelude
+import Network.HTTP.Client qualified as HTTP
 import Servant.Auth.Server (JWTSettings)
 import Servant.Client (BaseUrl(..), Scheme(Http), parseBaseUrl)
 import Toml.Schema
...
@@ -134,3 +136,6 @@ instance HasConfig GargConfig where
 class HasJWTSettings env where
   jwtSettings :: Getter env JWTSettings
+
+class HasManager env where
+  gargHttpManager :: Getter env HTTP.Manager
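
A minimal hedged sketch of how an application environment could satisfy the new 'HasManager' class; the 'MyEnv' record and its field are invented for illustration, real code would instead extend the existing Env types:

{-# LANGUAGE TemplateHaskell #-}
module ExampleEnv where

import Control.Lens (makeLenses)
import Gargantext.Core.Config (HasManager(..))
import qualified Network.HTTP.Client as HTTP

-- A made-up environment record holding a shared connection manager
-- for outgoing HTTP calls (e.g. remote export/import).
data MyEnv = MyEnv
  { _my_manager :: HTTP.Manager
  }

makeLenses ''MyEnv

instance HasManager MyEnv where
  gargHttpManager = my_manager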
src/Gargantext/Core/Text/Ngrams.hs

@@ -88,6 +88,7 @@ data Ngrams = UnsafeNgrams { _ngramsTerms :: Text
   deriving (Generic, Show, Eq, Ord)

 instance Hashable Ngrams
+instance Serialise Ngrams where

 makeLenses ''Ngrams
 instance PGS.ToRow Ngrams where
...
src/Gargantext/Core/Types.hs

@@ -188,6 +188,8 @@ $(deriveJSON (unPrefix "tr_") ''TableResult)
 instance (Typeable a, ToSchema a) => ToSchema (TableResult a) where
   declareNamedSchema = wellNamedSchema "tr_"

+instance NFData a => NFData (TableResult a) where
+
 ----------------------------------------------------------------------------
 data Typed a b =
   Typed { _withType :: a
...
src/Gargantext/Core/Types/Individu.hs

@@ -57,7 +57,9 @@ instance Prelude.Show GargPassword where
 instance ToJSON GargPassword
 instance FromJSON GargPassword

-instance ToSchema GargPassword
+instance ToSchema GargPassword where
+  declareNamedSchema _ = pure $ NamedSchema (Just "GargPassword") passwordSchema

 type Email = Text
 type UsernameMaster = Username
 type UsernameSimple = Username
...
src/Gargantext/Core/Types/Main.hs

@@ -13,11 +13,13 @@ Portability : POSIX

 {-# LANGUAGE TemplateHaskell #-}
 {-# LANGUAGE DeriveAnyClass #-}
+{-# LANGUAGE BangPatterns #-}
 -----------------------------------------------------------------------
 module Gargantext.Core.Types.Main where
 ------------------------------------------------------------------------
+import Codec.Serialise.Class
 import Data.Bimap (Bimap)
 import Data.Bimap qualified as Bimap
 import Data.Swagger (ToSchema(..), ToParamSchema, genericDeclareNamedSchema)
...
@@ -29,8 +31,8 @@ import Gargantext.Core.Utils.Swagger (wellNamedSchema)
 import Gargantext.Database.Admin.Types.Node -- (NodeType(..), Node, Hyperdata(..))
 import Gargantext.Prelude
 import Servant.API (FromHttpApiData(..), ToHttpApiData(..))
-import Test.QuickCheck (elements)
 import Test.QuickCheck.Arbitrary (Arbitrary, arbitrary)
+import Test.QuickCheck (elements)

 type CorpusName = Text
 ------------------------------------------------------------------------
...
@@ -40,6 +42,8 @@ data NodeTree = NodeTree { _nt_name :: Text
                          , _nt_publish_policy :: Maybe NodePublishPolicy
                          } deriving (Show, Read, Generic)

+instance NFData NodeTree where
+
 instance Eq NodeTree where
   (==) d1 d2 = _nt_id d1 == _nt_id d2
...
@@ -56,6 +60,7 @@ type TypeId = Int
 data ListType = CandidateTerm | StopTerm | MapTerm
   deriving (Generic, Eq, Ord, Show, Read, Enum, Bounded, ToExpr)

+instance NFData ListType where
 instance ToJSON ListType
 instance FromJSON ListType
 instance ToSchema ListType
...
@@ -115,6 +120,44 @@ fromListTypeId = flip Bimap.lookupR listTypeIds
 data Tree a = TreeN { _tn_node :: a, _tn_children :: [Tree a] }
   deriving (Show, Read, Eq, Generic, Ord)

+instance Serialise a => Serialise (Tree a) where
+instance NFData a => NFData (Tree a) where
+
+instance Functor Tree where
+  fmap = fmapTree
+  x <$ TreeN _ ts = TreeN x (map (x <$) ts)
+
+fmapTree :: (a -> b) -> Tree a -> Tree b
+fmapTree f (TreeN x ts) = TreeN (f x) (map (fmapTree f) ts)
+
+instance Traversable Tree where
+  traverse f = go
+    where go (TreeN x ts) = liftA2 TreeN (f x) (traverse go ts)
+  {-# INLINE traverse #-}
+
+instance Foldable Tree where
+  fold = foldMap identity
+  {-# INLINABLE fold #-}
+
+  foldMap = foldMapDefault
+  {-# INLINE foldMap #-}
+
+  foldr f z = \t -> go t z  -- Use a lambda to allow inlining with two arguments
+    where go (TreeN x ts) = f x . foldr (\t k -> go t . k) identity ts
+  {-# INLINE foldr #-}
+
+  foldl' f = go
+    where go !z (TreeN x ts) = foldl' go (f z x) ts
+  {-# INLINE foldl' #-}
+
+  null _ = False
+  {-# INLINE null #-}
+
+  elem = any . (==)
+  {-# INLINABLE elem #-}
+
 $(deriveJSON (unPrefix "_tn_") ''Tree)
 instance (Typeable a, ToSchema a) => ToSchema (Tree a) where
...
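
A quick hedged sketch of what the new Foldable/Traversable instances buy, assuming the Tree type above is in scope:

import Data.Foldable (toList)
import Gargantext.Core.Types.Main (Tree(..))

treeDemo :: IO ()
treeDemo = do
  let t = TreeN (1 :: Int) [TreeN 2 [], TreeN 3 [TreeN 4 []]]
  print (length t)   -- 4, counted via the Foldable instance
  print (sum t)      -- 10
  print (toList t)   -- [1,2,3,4]: the root comes before its children (see foldr above)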
src/Gargantext/Core/Viz/Graph/Bridgeness.hs

@@ -45,42 +45,6 @@ nodeId2comId (ClusterNode i1 i2) = (i1, i2)
 type NodeId        = Int
 type CommunityId   = Int

-----------------------------------------------------------------------
--- recursiveClustering : to get more granularity of a given clustering
--- tested with spinglass clustering only (WIP)
-recursiveClustering' :: Partitions' -> Map (Int, Int) Double -> IO [[Set NodeId]]
-recursiveClustering' f mp = do
-  let
-    n :: Double
-    n = fromIntegral $ Set.size
-                     $ Set.unions $ List.concat
-                     $ map (\(k1, k2) -> map Set.singleton [k1, k2])
-                     $ Map.keys mp
-    t :: Int
-    t = round $ 2 * n / sqrt n
-  ss <- f mp
-  mapM (\s -> if Set.size s > t then f (removeNodes s mp) else pure [s]) ss
-
-----------------------------------------------------------------------
-recursiveClustering :: Partitions -> Map (Int, Int) Double -> IO [ClusterNode]
-recursiveClustering f mp = do
-  let
-    n :: Double
-    n = fromIntegral $ Set.size
-                     $ Set.unions $ List.concat
-                     $ map (\(k1, k2) -> map Set.singleton [k1, k2])
-                     $ Map.keys mp
-    t :: Int
-    t = round $ 2 * n / sqrt n
-  (toSplit, others) <- List.span (\a -> Set.size a > t) <$> clusterNodes2sets <$> f mp
-  cls' <- mapM f $ map (\s -> removeNodes s mp) toSplit
-  pure $ setNodes2clusterNodes $ others <> (List.concat $ map clusterNodes2sets cls')
-
 ----------------------------------------------------------------------
 setNodes2clusterNodes :: [Set NodeId] -> [ClusterNode]
 setNodes2clusterNodes ns = List.concat $ map (\(n, ns') -> toCluster n ns') $ zip [1..] ns
...
src/Gargantext/Core/Viz/Graph/GEXF.hs

@@ -51,19 +51,30 @@ graphToXML (G.Graph { .. }) = root _graph_nodes _graph_edges
     desc = XML.tag "description" mempty $ XML.content "Gargantext gexf file"

     graph :: (Monad m) => [G.Node] -> [G.Edge] -> ConduitT i XML.Event m ()
-    graph gn ge = XML.tag "graph" params $ (nodes gn) <> (edges ge)
+    graph gn ge = XML.tag "graph" params $ graphAttributes <> (nodes gn) <> (edges ge)
       where
         params = XML.attr "mode" "static"
              <> XML.attr "defaultedgetype" "directed"

+    graphAttributes :: (Monad m) => ConduitT i XML.Event m ()
+    graphAttributes = XML.tag "attributes" graphAttributesParams $ graphAttributeWeight
+      where
+        graphAttributesParams = XML.attr "class" "node"
+        graphAttributeWeight = XML.tag "attribute" attrWeightParams $ XML.content ""
+        attrWeightParams = XML.attr "id" "0"
+                        <> XML.attr "title" "weight"
+                        <> XML.attr "type" "integer"
+
     nodes :: (Monad m) => [G.Node] -> ConduitT i XML.Event m ()
     nodes gn = XML.tag "nodes" mempty (yieldMany gn .| awaitForever node')

     node' :: (Monad m) => G.Node -> ConduitT i XML.Event m ()
-    node' (G.Node { .. }) = XML.tag "node" params (XML.tag "viz:size" sizeParams $ XML.content "")
+    -- node' (G.Node { .. }) = XML.tag "node" params (XML.tag "viz:size" sizeParams $ XML.content "")
+    node' (G.Node { .. }) = XML.tag "node" params $ XML.tag "attvalues" mempty $ XML.tag "attvalue" sizeParams mempty
       where
         params = XML.attr "id" node_id
              <> XML.attr "label" node_label
-        sizeParams = XML.attr "value" (show node_size)
+        sizeParams = XML.attr "for" "0" <> XML.attr "value" (show node_size)

     edges :: (Monad m) => [G.Edge] -> ConduitT i XML.Event m ()
     edges ge = XML.tag "edges" mempty (yieldMany ge .| awaitForever edge')
...
src/Gargantext/Core/Viz/Graph/Tools.hs

@@ -30,7 +30,7 @@ import Gargantext.API.Ngrams.Types (NgramsTerm(..))
 import Gargantext.Core.Methods.Similarities (Similarity(..), measure)
 import Gargantext.Core.Statistics (pcaReduceTo, Dimension(Dimension))
 import Gargantext.Core.Text.Ngrams (NgramsType(..))
-import Gargantext.Core.Viz.Graph.Bridgeness (bridgeness, Bridgeness(..), Partitions, nodeId2comId, {-recursiveClustering,-} recursiveClustering', setNodes2clusterNodes)
+import Gargantext.Core.Viz.Graph.Bridgeness (bridgeness, Bridgeness(..), Partitions, nodeId2comId, setNodes2clusterNodes)
 import Gargantext.Core.Viz.Graph.Index (createIndices, toIndex, map2mat, mat2map, Index, MatrixShape(..))
 import Gargantext.Core.Viz.Graph.Tools.IGraph (mkGraphUfromEdges, spinglass, spinglass')
 import Gargantext.Core.Viz.Graph.Tools.Infomap (infomap)
...
@@ -38,7 +38,7 @@ import Gargantext.Core.Viz.Graph.Types (Attributes(..), Edge(..), Graph(..), Mul
 import Gargantext.Core.Viz.Graph.Utils (edgesFilter, nodesFilter)
 import Gargantext.Prelude
 import Graph.BAC.ProxemyOptim qualified as BAC
-import Graph.Types (ClusterNode)
+import Graph.Types (ClusterNode(..))
 import IGraph qualified as Igraph
 import IGraph.Algorithms.Layout qualified as Layout
 import IGraph.Random (Gen) -- (Gen(..))
...
@@ -117,7 +117,7 @@ cooc2graphWith' _doPartitions _bridgenessMethod multi similarity threshold stren
   distanceMap `seq` diag `seq` ti `seq` pure ()

   partitions <- if (Map.size distanceMap > 0)
-      then recursiveClustering' (spinglass' 1) distanceMap
+      then spinglass' 1 distanceMap
      else panic $ Text.unwords [ "I can not compute the graph you request"
                                , "because either the quantity of documents"
                                , "or the quantity of terms"
...
@@ -130,30 +130,29 @@ cooc2graphWith' _doPartitions _bridgenessMethod multi similarity threshold stren
   let
     !confluence' = BAC.computeConfluences 3 (Map.keys distanceMap) True
-    !bridgeness' = bridgeness (Bridgeness_Recursive partitions 1.0 similarity) distanceMap
-
-  pure $ data2graph multi ti diag bridgeness' confluence' (setNodes2clusterNodes $ List.concat partitions)
-
-{-
-cooc2graphWith' _doPartitions _bridgenessMethod multi similarity@Distributional threshold strength myCooc = do
-  let (distanceMap, diag, ti) = doSimilarityMap similarity threshold strength myCooc
-  distanceMap `seq` diag `seq` ti `seq` pure ()
-
-  partitions <- if (Map.size distanceMap > 0)
-      then recursiveClustering (spinglass 1) distanceMap
-      else panic $ Text.unlines [ "[Gargantext.C.V.Graph.Tools] Similarity Matrix is empty"
-                                , "Maybe you should add more Map Terms in your list"
-                                , "Tutorial: TODO"
-                                ]
-  length partitions `seq` pure ()
-
-  let
-    !confluence' = BAC.computeConfluences 3 (Map.keys distanceMap) True
-    !bridgeness' = bridgeness (Bridgeness_Basic partitions 1.0) distanceMap
-
-  pure $ data2graph multi ti diag bridgeness' confluence' partitions
--}
+    !bridgeness' = bridgeness (Bridgeness_Basic (partitionsToClusterNodes partitions) 1.0)
+                              distanceMap
+
+  pure $ data2graph multi ti diag bridgeness' confluence' (setNodes2clusterNodes partitions)
+
+-- | A converter from the partition type returned by `spinglass'`
+-- to the partition type required by `bridgeness`
+partitionsToClusterNodes :: [Set Int] -> [ClusterNode]
+partitionsToClusterNodes setlist =
+  setlist &
+  -- Convert sets to lists:
+  fmap toList &
+  -- Assign an integer index to each cluster:
+  zip [1..] &
+  -- Attach cluster IDs to individual nodes instead to whole clusters
+  fmap (\(id, clusterIds) -> zip (repeat id) clusterIds) &
+  -- Flatten list of clusters of nodes labeled by cluster indices
+  -- into a list of labeled nodes:
+  join &
+  -- Turn pairs into `ClusterNode`s
+  fmap (\(clusterId, nodeId) -> ClusterNode nodeId clusterId)

 type Reverse = Bool
...
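
A hedged worked example of the converter above (assuming it is exported from this module): two spinglass partitions become per-node ClusterNode labels.

import qualified Data.Set as Set
import Gargantext.Core.Viz.Graph.Tools (partitionsToClusterNodes)
import Graph.Types (ClusterNode(..))

clusterDemo :: [ClusterNode]
clusterDemo = partitionsToClusterNodes [Set.fromList [10, 11], Set.fromList [12]]
-- Cluster 1 holds nodes 10 and 11, cluster 2 holds node 12:
--   [ClusterNode 10 1, ClusterNode 11 1, ClusterNode 12 2]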
src/Gargantext/Core/Worker.hs

@@ -30,15 +30,16 @@ import Database.PostgreSQL.Simple.LargeObjects qualified as PSQL
 import Gargantext.API.Admin.Auth (forgotUserPassword)
 import Gargantext.API.Admin.Auth.Types (ForgotPasswordAsyncParams(..))
 import Gargantext.API.Ngrams.List (postAsyncJSON)
-import Gargantext.API.Node.Corpus.Annuaire qualified as Annuaire
 import Gargantext.API.Node.Contact (addContact)
+import Gargantext.API.Node.Corpus.Annuaire qualified as Annuaire
 import Gargantext.API.Node.Corpus.New (addToCorpusWithTempFile, addToCorpusWithQuery)
 import Gargantext.API.Node.DocumentsFromWriteNodes (documentsFromWriteNodes)
-import Gargantext.API.Node.DocumentUpload (documentUploadAsync)
-import Gargantext.API.Node.FrameCalcUpload (frameCalcUploadAsync)
+import Gargantext.API.Node.DocumentUpload (documentUploadAsync, remoteImportDocuments)
 import Gargantext.API.Node.File (addWithFile)
+import Gargantext.API.Node.FrameCalcUpload (frameCalcUploadAsync)
 import Gargantext.API.Node.New (postNode')
 import Gargantext.API.Node.Types (_wtf_file_oid)
+import Gargantext.API.Node.Update.Types (UpdateNodeParams(..), Granularity(..))
 import Gargantext.API.Node.Update (updateNode)
 import Gargantext.API.Server.Named.Ngrams (tableNgramsPostChartsAsync)
 import Gargantext.Core.Config (hasConfig, gc_database_config, gc_jobs, gc_notifications_config, gc_worker)
...
@@ -49,8 +50,8 @@ import Gargantext.Core.Notifications.CentralExchange.Types qualified as CET
 import Gargantext.Core.Viz.Graph.API (graphRecompute)
 import Gargantext.Core.Worker.Broker (initBrokerWithDBCreate)
 import Gargantext.Core.Worker.Env
+import Gargantext.Core.Worker.Jobs.Types (Job(..), getWorkerMNodeId, ImportRemoteDocumentsPayload(..), ImportRemoteTermsPayload(..))
 import Gargantext.Core.Worker.PGMQTypes (BrokerMessage, HasWorkerBroker, WState)
-import Gargantext.Core.Worker.Jobs.Types (Job(..), getWorkerMNodeId)
 import Gargantext.Core.Worker.Types (JobInfo(..))
 import Gargantext.Database.Prelude (readLargeObject, removeLargeObject)
 import Gargantext.Database.Query.Table.User (getUsersWithEmail)
...
@@ -307,3 +308,19 @@ performAction env _state bm = do
     UploadDocument { .. } -> runWorkerMonad env $ do
       $(logLocM) DEBUG $ "[performAction] upload document"
       void $ documentUploadAsync _ud_node_id _ud_args jh
+
+    -- | Remotely import documents
+    ImportRemoteTerms (ImportRemoteTermsPayload list_id ngrams_list) -> runWorkerMonad env $ do
+      $(logLocM) DEBUG $ "[performAction] import remote terms"
+      void $ postAsyncJSON list_id ngrams_list jh
+      -- Trigger an 'UpdateNode' job to update the score(s)
+      $(logLocM) DEBUG $ "Updating node scores for corpus node " <> T.pack (show list_id)
+      void $ updateNode list_id (UpdateNodeParamsTexts Both) jh
+      $(logLocM) DEBUG $ "Done updating node scores for corpus node " <> T.pack (show list_id)
+
+    -- | Remotely import documents
+    ImportRemoteDocuments (ImportRemoteDocumentsPayload loggedInUser parentId corpusId docs workSplit) -> runWorkerMonad env $ do
+      $(logLocM) DEBUG $ "[performAction] import remote documents"
+      void $ remoteImportDocuments loggedInUser parentId corpusId workSplit docs
src/Gargantext/Core/Worker/Jobs.hs

@@ -60,6 +60,8 @@ updateJobData (NgramsPostCharts {}) sj = sj { W.timeout = 3000 }
 updateJobData (RecomputeGraph {}) sj = sj { W.timeout = 3000 }
 updateJobData (UpdateNode {}) sj = sj { W.timeout = 3000 }
 updateJobData (UploadDocument {}) sj = sj { W.timeout = 3000 }
+updateJobData (ImportRemoteDocuments {}) sj = sj { W.timeout = 3000 }
+updateJobData (ImportRemoteTerms {}) sj = sj { W.timeout = 3000 }
 -- | ForgotPasswordAsync, PostNodeAsync
 updateJobData _ sj = sj { W.resendOnKill = False
                         , W.timeout = 60 }
src/Gargantext/Core/Worker/Jobs/Types.hs
View file @
b3a89d3a
...
@@ -9,26 +9,92 @@ Portability : POSIX
...
@@ -9,26 +9,92 @@ Portability : POSIX
-}
-}
{-# LANGUAGE TypeApplications #-}
module
Gargantext.Core.Worker.Jobs.Types
where
module
Gargantext.Core.Worker.Jobs.Types
where
import
Data.Aeson
((
.:
),
(
.=
),
object
,
withObject
)
import
Data.Aeson
((
.:
),
(
.=
),
object
,
withObject
)
import
Data.Aeson.Types
(
prependFailure
,
typeMismatch
)
import
Data.Aeson.Types
(
prependFailure
,
typeMismatch
)
import
Data.Aeson
qualified
as
JS
import
Data.Aeson.KeyMap
qualified
as
KM
import
Data.Text
qualified
as
T
import
Gargantext.API.Admin.Auth.Types
(
AuthenticatedUser
,
ForgotPasswordAsyncParams
)
import
Gargantext.API.Admin.Auth.Types
(
AuthenticatedUser
,
ForgotPasswordAsyncParams
)
import
Gargantext.API.Ngrams.Types
(
UpdateTableNgramsCharts
(
_utn_list_id
))
import
Gargantext.API.Ngrams.Types
(
NgramsList
,
UpdateTableNgramsCharts
(
_utn_list_id
))
import
Gargantext.API.Node.Contact.Types
(
AddContactParams
)
import
Gargantext.API.Node.Contact.Types
(
AddContactParams
)
import
Gargantext.API.Node.Corpus.Annuaire
(
AnnuaireWithForm
)
import
Gargantext.API.Node.Corpus.Annuaire
(
AnnuaireWithForm
)
import
Gargantext.API.Node.Document.Export.Types
(
Document
)
import
Gargantext.API.Node.DocumentsFromWriteNodes.Types
qualified
as
DFWN
import
Gargantext.API.Node.DocumentsFromWriteNodes.Types
qualified
as
DFWN
import
Gargantext.API.Node.DocumentUpload.Types
(
DocumentUpload
)
import
Gargantext.API.Node.DocumentUpload.Types
(
DocumentUpload
)
import
Gargantext.API.Node.FrameCalcUpload.Types
(
FrameCalcUpload
)
import
Gargantext.API.Node.FrameCalcUpload.Types
(
FrameCalcUpload
)
import
Gargantext.API.Node.New.Types
(
PostNode
(
..
)
)
import
Gargantext.API.Node.New.Types
(
PostNode
(
..
)
)
import
Gargantext.API.Node.Update.Types
(
UpdateNodeParams
)
import
Gargantext.API.Node.Types
(
NewWithFile
,
NewWithTempFile
,
WithQuery
(
..
))
import
Gargantext.API.Node.Types
(
NewWithFile
,
NewWithTempFile
,
WithQuery
(
..
))
import
Gargantext.API.Node.Update.Types
(
UpdateNodeParams
)
import
Gargantext.Core.Types.Individu
(
User
)
import
Gargantext.Core.Types.Individu
(
User
)
import
Gargantext.Database.Admin.Types.Node
(
AnnuaireId
,
CorpusId
,
ListId
,
NodeId
(
UnsafeMkNodeId
))
import
Gargantext.Database.Admin.Types.Node
(
AnnuaireId
,
CorpusId
,
ListId
,
NodeId
(
UnsafeMkNodeId
)
,
ParentId
)
import
Gargantext.Prelude
import
Gargantext.Prelude
data
ImportRemoteTermsPayload
=
ImportRemoteTermsPayload
{
_irtp_list_id
::
ListId
,
_irtp_ngrams_list
::
NgramsList
}
deriving
(
Show
,
Eq
)
instance
ToJSON
ImportRemoteTermsPayload
where
toJSON
ImportRemoteTermsPayload
{
..
}
=
object
[
"list_id"
.=
_irtp_list_id
,
"ngrams_list"
.=
_irtp_ngrams_list
]
instance
FromJSON
ImportRemoteTermsPayload
where
parseJSON
=
withObject
"ImportRemoteTermsPayload"
$
\
o
->
do
_irtp_list_id
<-
o
.:
"list_id"
_irtp_ngrams_list
<-
o
.:
"ngrams_list"
pure
ImportRemoteTermsPayload
{
..
}
data
WorkSplit
=
WorkSplit
{
_ws_current
::
Int
,
_ws_total
::
Int
}
deriving
(
Show
,
Eq
)
instance
ToJSON
WorkSplit
where
toJSON
WorkSplit
{
..
}
=
object
[
"current"
.=
_ws_current
,
"total"
.=
_ws_total
]
instance
FromJSON
WorkSplit
where
parseJSON
=
withObject
"WorkSplit"
$
\
o
->
do
_ws_current
<-
o
.:
"current"
_ws_total
<-
o
.:
"total"
pure
WorkSplit
{
..
}
data
ImportRemoteDocumentsPayload
=
ImportRemoteDocumentsPayload
{
_irdp_user
::
AuthenticatedUser
,
_irdp_parent_id
::
ParentId
,
_irdp_corpus_id
::
NodeId
,
_irdp_documents
::
[
Document
]
-- | Useful to compute total progress in logs.
,
_irdp_work_split
::
WorkSplit
}
deriving
(
Show
,
Eq
)
instance
ToJSON
ImportRemoteDocumentsPayload
where
toJSON
ImportRemoteDocumentsPayload
{
..
}
=
object
[
"user"
.=
_irdp_user
,
"corpus_id"
.=
_irdp_corpus_id
,
"parent_id"
.=
_irdp_parent_id
,
"documents"
.=
_irdp_documents
,
"work_split"
.=
_irdp_work_split
]
instance
FromJSON
ImportRemoteDocumentsPayload
where
parseJSON
=
withObject
"ImportRemoteDocumentsPayload"
$
\
o
->
do
_irdp_user
<-
o
.:
"user"
_irdp_parent_id
<-
o
.:
"parent_id"
_irdp_corpus_id
<-
o
.:
"corpus_id"
_irdp_documents
<-
o
.:
"documents"
_irdp_work_split
<-
o
.:
"work_split"
pure
ImportRemoteDocumentsPayload
{
..
}
data Job =
    Ping
...
@@ -66,6 +132,8 @@ data Job =
               , _un_args :: UpdateNodeParams }
  | UploadDocument { _ud_node_id :: NodeId
                   , _ud_args    :: DocumentUpload }
+  | ImportRemoteDocuments !ImportRemoteDocumentsPayload
+  | ImportRemoteTerms !ImportRemoteTermsPayload
  deriving (Show, Eq)

instance FromJSON Job where
  parseJSON = withObject "Job" $ \o -> do
...
@@ -134,6 +202,10 @@ instance FromJSON Job where
        _ud_node_id <- o .: "node_id"
        _ud_args    <- o .: "args"
        return $ UploadDocument { .. }
+      "ImportRemoteDocuments" -> ImportRemoteDocuments <$> parseJSON (JS.Object o)
+      "ImportRemoteTerms"     -> ImportRemoteTerms     <$> parseJSON (JS.Object o)
      s -> prependFailure "parsing job type failed, " (typeMismatch "type" s)
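-- The ToJSON instance below grafts a "type" discriminator onto each payload's
-- own JSON object via the left-biased (<>) on aeson KeyMaps. A standalone
-- sketch of that pattern (illustrative only, plain aeson, OverloadedStrings
-- assumed; not part of the patch):
import Data.Aeson (Value(..), object, toJSON, (.=))
import Data.Aeson.KeyMap qualified as KM
import Data.Text (Text)

-- Tag an already-serialised Object with a "type" field; fall back to a bare
-- tag object if the payload did not serialise to an Object.
tagWithType :: Text -> Value -> Value
tagWithType ty (Object o) = Object (KM.fromList [("type", toJSON ty)] <> o)
tagWithType ty _          = object [ "type" .= ty ]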
instance ToJSON Job where
  toJSON Ping = object [ "type" .= ("Ping" :: Text) ]
...
@@ -199,10 +271,18 @@ instance ToJSON Job where
    object [ "type"    .= ("UploadDocument" :: Text)
           , "node_id" .= _ud_node_id
           , "args"    .= _ud_args ]
+  toJSON (ImportRemoteDocuments payload) =
+    case toJSON payload of
+      (JS.Object o) ->
+        let o1 = KM.fromList [ ("type", toJSON @T.Text "ImportRemoteDocuments") ]
+        in JS.Object $ o1 <> o
+      _ -> errorTrace "impossible, toJSON ImportRemoteDocuments did not return an Object."
+  toJSON (ImportRemoteTerms payload) =
+    case toJSON payload of
+      (JS.Object o) ->
+        let o1 = KM.fromList [ ("type", toJSON @T.Text "ImportRemoteTerms") ]
+        in JS.Object $ o1 <> o
+      _ -> errorTrace "impossible, toJSON ImportRemoteTerms did not return an Object."
-- | We want to have a way to specify 'Maybe NodeId' from given worker
-- parameters. The given 'Maybe CorpusId' is an alternative, when
...
@@ -226,3 +306,5 @@ getWorkerMNodeId (PostNodeAsync { _pna_node_id }) = Just _pna_node_id
getWorkerMNodeId (RecomputeGraph { _rg_node_id }) = Just _rg_node_id
getWorkerMNodeId (UpdateNode { _un_node_id }) = Just _un_node_id
getWorkerMNodeId (UploadDocument { _ud_node_id }) = Just _ud_node_id
+getWorkerMNodeId (ImportRemoteDocuments (ImportRemoteDocumentsPayload _ _ corpusId _ _)) = Just corpusId
+getWorkerMNodeId (ImportRemoteTerms (ImportRemoteTermsPayload listId _)) = Just listId
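The `getWorkerMNodeId` clauses above are what let the worker report a job back against a concrete node. A tiny illustrative consumer (hypothetical, not part of the patch; assumes NodeId's Show instance):

describeJobTarget :: Job -> String
describeJobTarget job =
  case getWorkerMNodeId job of
    Nothing  -> "job has no associated node"
    Just nId -> "job targets node " <> show nId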
src/Gargantext/Core/Worker/Types.hs

+{-# LANGUAGE StandaloneDeriving #-}
+{-# LANGUAGE DerivingStrategies #-}
{-|
Module      : Gargantext.Core.Worker.Types
Description : Some useful worker types
...
@@ -36,4 +38,4 @@ instance FromJSON JobInfo where
instance ToJSON JobInfo where
  toJSON (JobInfo { .. }) = object [ "message_id" .= _ji_message_id
                                   , "node_id"    .= _ji_mNode_id ]

+instance NFData JobInfo
src/Gargantext/Database/Action/Flow.hs

...
@@ -389,12 +389,6 @@ buildSocialList l user userCorpusId listId ctype mfslw = do
    <- getOrMkRootWithCorpus MkCorpusUserMaster ctype
  nlpServer <- view (nlpServerGet l)
-  --let gp = (GroupParams l 2 3 (StopSize 3))
-  -- Here the PosTagAlgo should be chosen according to the Lang
-  -- let gp = GroupParams { unGroupParams_lang = l
-  --                      , unGroupParams_len = 10
-  --                      , unGroupParams_limit = 10
-  --                      , unGroupParams_stopSize = StopSize 10 }
  let gp = GroupWithPosTag l nlpServer HashMap.empty
  ngs <- buildNgramsLists user userCorpusId masterCorpusId mfslw gp
...
@@ -424,7 +418,6 @@ insertMasterDocs ncs c lang hs = do
  -- add documents to the corpus (create node_node link)
  -- this will enable global database monitoring
-  -- maps :: IO Map Ngrams (Map NgramsType (Map NodeId Int))
  mapNgramsDocs' :: HashMap.HashMap ExtractedNgrams (Map NgramsType (Map NodeId (TermsWeight, TermsCount)))
    <- mapNodeIdNgrams
       <$> documentIdWithNgrams
...
@@ -432,10 +425,8 @@ insertMasterDocs ncs c lang hs = do
         (map (B.first contextId2NodeId) documentsWithId)
  lId <- getOrMkList masterCorpusId masterUserId
-  -- _ <- saveDocNgramsWith lId mapNgramsDocs'
  _ <- saveDocNgramsWith lId mapNgramsDocs'
-  -- _cooc <- insertDefaultNode NodeListCooc lId masterUserId
  pure $ map contextId2NodeId ids'
...
@@ -444,9 +435,6 @@ saveDocNgramsWith :: (IsDBCmd env err m)
                  -> HashMap.HashMap ExtractedNgrams (Map NgramsType (Map NodeId (TermsWeight, TermsCount)))
                  -> m ()
saveDocNgramsWith lId mapNgramsDocs' = do
-  --printDebug "[saveDocNgramsWith] mapNgramsDocs'" mapNgramsDocs'
-  -- let mapNgramsDocsNoCount :: HashMap.HashMap ExtractedNgrams (Map NgramsType (Map NodeId Int))
-  --     mapNgramsDocsNoCount = over (traverse . traverse . traverse) fst mapNgramsDocs'
  (terms2id :: HashMap.HashMap Text NgramsId) <- insertExtractedNgrams $ HashMap.keys mapNgramsDocs'
  let mapNgramsDocs :: HashMap.HashMap Ngrams (Map NgramsType (Map NodeId (TermsWeight, TermsCount)))
...
@@ -457,8 +445,6 @@ saveDocNgramsWith lId mapNgramsDocs' = do
        $ map (bimap _ngramsTerms Map.keys)
        $ HashMap.toList mapNgramsDocs
-  --printDebug "saveDocNgramsWith" mapCgramsId
-  -- insertDocNgrams
  let ngrams2insert = catMaybes [ ContextNodeNgrams2 (nodeId2ContextId nId)
                                    <$> getCgramsId mapCgramsId ngrams_type (_ngramsTerms terms'')
                                    <*> Just (fromIntegral $ unTermsWeight w :: Double)
...
@@ -466,7 +452,6 @@ saveDocNgramsWith lId mapNgramsDocs' = do
                                , (ngrams_type, mapNodeIdWeight) <- Map.toList mapNgramsTypes
                                , (nId, (w, _cnt)) <- Map.toList mapNodeIdWeight
                                ]
-  -- printDebug "Ngrams2Insert" ngrams2insert
  _return <- insertContextNodeNgrams2 ngrams2insert
  -- to be removed
...
@@ -501,9 +486,5 @@ reIndexWith cId lId nt lts = do
  -- Get all documents of the corpus
  (docs :: [ContextOnlyId HyperdataDocument]) <- selectDocNodesOnlyId cId
-  let ngramsByDoc' = ngramsByDoc corpusLang nt ts docs
  -- Saving the indexation in database
-  mapM_ (saveDocNgramsWith lId) ngramsByDoc'
+  mapM_ (saveDocNgramsWith lId . ngramsByDoc corpusLang nt ts) docs
  pure ()
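The reIndexWith change above replaces the named intermediate list with a composition; both shapes traverse the documents once, the composed form simply avoids binding the whole list of per-document maps before saving. A minimal sketch of the equivalence (illustrative only, not project code):

oldStyle, newStyle :: [String] -> IO ()
oldStyle docs = let built = map length docs in mapM_ print built
newStyle docs = mapM_ (print . length) docs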
src/Gargantext/Database/Action/Flow/Utils.hs

...
@@ -185,23 +185,14 @@ toInserted =
-- | Given list of terms and a document, produce a map for this doc's
-- terms count and weights. Notice that the weight is always 1 here.
ngramsByDoc :: Lang
            -> NgramsType
            -> [NT.NgramsTerm]
-            -> [ContextOnlyId HyperdataDocument]
-            -> [HashMap.HashMap ExtractedNgrams (DM.Map NgramsType (Map NodeId (TermsWeight, TermsCount)))]
-ngramsByDoc l nt ts docs = ngramsByDoc' l nt ts <$> docs
-
--- | Given list of terms and a document, produce a map for this doc's
--- terms count and weights. Notice that the weight is always 1 here.
-ngramsByDoc' :: Lang
-             -> NgramsType
-             -> [NT.NgramsTerm]
-             -> ContextOnlyId HyperdataDocument
-             -> HashMap.HashMap ExtractedNgrams (DM.Map NgramsType (Map NodeId (TermsWeight, TermsCount)))
-ngramsByDoc' l nt ts doc =
+            -> ContextOnlyId HyperdataDocument
+            -> HashMap.HashMap ExtractedNgrams (DM.Map NgramsType (Map NodeId (TermsWeight, TermsCount)))
+ngramsByDoc l nt ts doc =
  HashMap.map (\cnt -> DM.singleton nt $ DM.singleton nId (1, cnt)) extractedMap
  where
    matched :: [(MatchedText, TermsCount)]
...
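Since ngramsByDoc now works on a single document, any caller that still wants the old per-corpus list of maps can simply map it over the documents. A hypothetical wrapper, using the types from the old signature above:

ngramsByDocs :: Lang
             -> NgramsType
             -> [NT.NgramsTerm]
             -> [ContextOnlyId HyperdataDocument]
             -> [HashMap.HashMap ExtractedNgrams (DM.Map NgramsType (Map NodeId (TermsWeight, TermsCount)))]
ngramsByDocs l nt ts = map (ngramsByDoc l nt ts)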
src/Gargantext/Database/Admin/Types/Hyperdata/Any.hs

...
@@ -21,6 +21,8 @@ import Gargantext.Database.Admin.Types.Hyperdata.Prelude
newtype HyperdataAny = HyperdataAny Object
  deriving (Show, Generic, ToJSON, FromJSON)

+instance NFData HyperdataAny
+
------------------------------------------------------------------------
-- Instances
------------------------------------------------------------------------
...
src/Gargantext/Database/Admin/Types/Hyperdata/Contact.hs

...
@@ -39,6 +39,8 @@ data HyperdataContact =
instance GQLType HyperdataContact where
  directives _ = typeDirective DropNamespace { dropNamespace = "_hc_" }

+instance NFData HyperdataContact where
+
instance HasText HyperdataContact
  where
    hasText = undefined
...
@@ -83,7 +85,7 @@ arbitraryHyperdataContact =
                             , _hc_lastValidation = Nothing }

data ContactWho =
  ContactWho { _cw_id        :: Maybe Text
             , _cw_firstName :: Maybe Text
             , _cw_lastName  :: Maybe Text
...
@@ -95,6 +97,8 @@ data ContactWho =
instance GQLType ContactWho where
  directives _ = typeDirective DropNamespace { dropNamespace = "_cw_" }

+instance NFData ContactWho where
+
type FirstName = Text
type LastName  = Text
...
@@ -113,15 +117,11 @@ contactWho fn ln =
data ContactWhere =
  ContactWhere { _cw_organization :: [Text]
               , _cw_labTeamDepts :: [Text]
               , _cw_role         :: Maybe Text
               , _cw_office       :: Maybe Text
               , _cw_country      :: Maybe Text
               , _cw_city         :: Maybe Text
               , _cw_touch        :: Maybe ContactTouch
               , _cw_entry        :: Maybe NUTCTime
               , _cw_exit         :: Maybe NUTCTime
               } deriving (Eq, Show, Generic)
...
@@ -129,6 +129,8 @@ data ContactWhere =
instance GQLType ContactWhere where
  directives _ = typeDirective DropNamespace { dropNamespace = "_cw_" }

+instance NFData ContactWhere where
+
defaultContactWhere :: ContactWhere
defaultContactWhere =
  ContactWhere
...
@@ -151,6 +153,8 @@ data ContactTouch =
instance GQLType ContactTouch where
  directives _ = typeDirective DropNamespace { dropNamespace = "_ct_" }

+instance NFData ContactTouch where
+
defaultContactTouch :: ContactTouch
defaultContactTouch =
  ContactTouch
...
src/Gargantext/Database/Admin/Types/Hyperdata/Document.hs

...
@@ -17,6 +17,7 @@ import Gargantext.Prelude hiding (ByteString)
import Gargantext.Core.Text (HasText(..))
import Gargantext.Core.Utils.Prefix (unCapitalize, dropPrefix)
import Gargantext.Database.Admin.Types.Hyperdata.Prelude
+import Codec.Serialise.Class hiding (decode)
------------------------------------------------------------------------
data HyperdataDocument = HyperdataDocument { _hd_bdd :: !(Maybe Text)
...
@@ -40,6 +41,8 @@ data HyperdataDocument = HyperdataDocument { _hd_bdd :: !(Maybe T
  }
  deriving (Show, Generic)

+instance NFData HyperdataDocument
+instance Serialise HyperdataDocument

instance HasText HyperdataDocument
  where
...
src/Gargantext/Database/Admin/Types/Hyperdata/User.hs

...
@@ -40,6 +40,8 @@ data HyperdataUser =
instance GQLType HyperdataUser where
  directives _ = typeDirective DropNamespace { dropNamespace = "_hu_" }

+instance NFData HyperdataUser where
+
data HyperdataPrivate =
  HyperdataPrivate { _hpr_password :: !Text
                   , _hpr_lang     :: !Lang
...
@@ -49,6 +51,8 @@ data HyperdataPrivate =
instance GQLType HyperdataPrivate where
  directives _ = typeDirective DropNamespace { dropNamespace = "_hpr_" }

+instance NFData HyperdataPrivate where
+
data HyperdataPublic =
  HyperdataPublic { _hpu_pseudo :: !Text
...
@@ -59,6 +63,8 @@ data HyperdataPublic =
instance GQLType HyperdataPublic where
  directives _ = typeDirective DropNamespace { dropNamespace = "_hpu_" }

+instance NFData HyperdataPublic where
+
-- | Default
defaultHyperdataUser :: HyperdataUser
defaultHyperdataUser =
...
src/Gargantext/Database/Admin/Types/Node.hs

...
@@ -69,6 +69,9 @@ newtype UserId = UnsafeMkUserId { _UserId :: Int }
  deriving stock (Show, Eq, Ord, Generic)
  deriving newtype (ToSchema, ToJSON, FromJSON, FromField, ToField, Hashable)

+instance NFData UserId where
+instance Serialise UserId where
+
-- The 'UserId' is isomorphic to an 'Int'.
instance GQLType UserId where
  type KIND UserId = SCALAR
...
@@ -258,6 +261,8 @@ newtype NodeId = UnsafeMkNodeId { _NodeId :: Int }
  deriving newtype (Num, Enum, ToJSONKey, FromJSONKey, ToJSON, FromJSON, Hashable, Csv.ToField)
  deriving anyclass (ToExpr)

+instance NFData NodeId where
+
instance ResourceId NodeId where
  isPositive = (> 0) . _NodeId
...
@@ -292,6 +297,7 @@ newtype ContextId = UnsafeMkContextId { _ContextId :: Int }
  deriving FromField via NodeId

instance ToParamSchema ContextId
+instance NFData ContextId
instance Arbitrary ContextId where
  arbitrary = UnsafeMkContextId . getPositive <$> arbitrary
...
@@ -444,6 +450,7 @@ data NodeType
  deriving (Show, Read, Eq, Ord, Generic, Bounded, Enum)

instance GQLType NodeType
+instance NFData NodeType where
-- /NOTE/ (adn) For backward-compatibility reasons, we keep the format for ToJSON/FromJSON similar
-- to what 'Show/Read' would generate, but we otherwise generate \"by hand\" the mapping between a
...
@@ -651,6 +658,8 @@ data NodePublishPolicy
  | NPP_publish_edits_only_owner_or_super
  deriving (Show, Read, Generic, Eq, Ord, Enum, Bounded)

+instance NFData NodePublishPolicy where
+
instance HasDBid NodePublishPolicy where
  toDBid = \case
    NPP_publish_no_edits_allowed
...
src/Gargantext/Database/Query/Facet/Types.hs

...
@@ -56,6 +56,8 @@ data Facet id date hyperdata score =
  } deriving (Show, Generic)
-}

+instance ( NFData id, NFData created, NFData title, NFData hyper, NFData cat, NFData count, NFData score )
+  => NFData (Facet id created title hyper cat count score) where
+
data Pair i l = Pair {
...
src/Gargantext/Database/Query/Table/Node.hs

...
@@ -28,6 +28,7 @@ module Gargantext.Database.Query.Table.Node
  , getClosestParentIdByType'
  , getCorporaWithParentId
  , getNode
+  , getNodes
  , getParent
  , getNodeWith
  , getNodeWithType
...
@@ -54,6 +55,7 @@ module Gargantext.Database.Query.Table.Node
  , insertDefaultNodeIfNotExists
  , insertNode
  , insertNodesWithParentR
+  , insertNodeWithHyperdata
  -- * Deleting one or more nodes
  , deleteNode
...
@@ -83,7 +85,7 @@ import Gargantext.Database.Admin.Types.Hyperdata.List ( HyperdataList )
import Gargantext.Database.Admin.Types.Hyperdata.Prelude ( Hyperdata )
import Gargantext.Database.Prelude ( DBCmd, JSONB, mkCmd, execPGSQuery, runPGSQuery, runOpaQuery )
import Gargantext.Database.Query.Filter ( limit', offset' )
-import Gargantext.Database.Query.Table.Node.Children ( getChildrenById )
+import Gargantext.Database.Query.Table.Node.Children ( getChildrenByParentId )
import Gargantext.Database.Query.Table.Node.Error
import Gargantext.Database.Schema.Node
import Gargantext.Prelude hiding (sum, head)
...
@@ -312,6 +314,15 @@ getNode nId = do
    Nothing -> nodeError (DoesNotExist nId)
    Just r  -> pure r

+-- | Get the nodes recursively, as a hierarchical tree.
+getNodes :: HasNodeError err => NodeId -> DBCmd err (Tree (Node Value))
+getNodes nId = do
+  root <- getNode nId
+  children <- getChildrenByParentId nId
+  case children of
+    [] -> pure $ TreeN root []
+    xs -> TreeN root <$> forM xs getNodes
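-- getNodes returns the project's hierarchical Tree. Assuming the usual shape
-- data Tree a = TreeN a [Tree a] (matching the TreeN constructor used above),
-- an illustrative fold over the result could count the nodes fetched:
countNodes :: Tree a -> Int
countNodes (TreeN _ children) = 1 + sum (map countNodes children)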
-- | Get the parent of a given 'Node', failing if this was called
-- on a root node.
getParent :: HasNodeError err => Node a -> DBCmd err (Node Value)
...
@@ -345,19 +356,24 @@ insertDefaultNodeIfNotExists nt p u = do
insertNode :: (HasDBid NodeType, HasNodeError err)
           => NodeType -> Maybe Name -> Maybe DefaultHyperdata -> ParentId -> UserId -> DBCmd err NodeId
-insertNode nt n h p u = do
-  res <- insertNodesR [nodeW nt n h p u]
-  case res of
-    [x] -> pure x
-    _   -> nodeError $ NodeCreationFailed $ InsertNodeFailed u p
-
-nodeW :: HasDBid NodeType => NodeType -> Maybe Name -> Maybe DefaultHyperdata -> ParentId -> UserId -> NodeWrite
-nodeW nt n h p u = node nt n' h' (Just p) u
+insertNode nt n h p u = insertNodeWithHyperdata nt n' h' (Just p) u
  where
    n' = fromMaybe (defaultName nt) n
    h' = maybe (defaultHyperdata nt) identity h
+
+insertNodeWithHyperdata :: ( ToJSON h, Hyperdata h, HasDBid NodeType, HasNodeError err )
+                        => NodeType -> Name -> h -> Maybe ParentId -> UserId -> DBCmd err NodeId
+insertNodeWithHyperdata nt n h p u = do
+  res <- insertNodesR [node nt n h p u]
+  case res of
+    [x] -> pure x
+    _   -> nodeError $ NodeCreationFailed $ InsertNodeFailed u p

------------------------------------------------------------------------
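-- insertNode keeps its old signature and now only fills in the default name
-- and hyperdata before delegating to insertNodeWithHyperdata. A hypothetical
-- call site, not part of the patch (NodeCorpus assumed to be the relevant
-- NodeType constructor):
mkDefaultCorpus :: (HasDBid NodeType, HasNodeError err)
                => ParentId -> UserId -> DBCmd err NodeId
mkDefaultCorpus parentId userId = insertNode NodeCorpus Nothing Nothing parentId userId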
node :: (ToJSON a, Hyperdata a, HasDBid NodeType)
     => NodeType
...
@@ -488,7 +504,7 @@ copyNode copySubtree smart idToCopy newParentId = if copySubtree
  then do
    -- Non-recursively copy the node itself, then recursively copy its children:
    copiedNode <- copyNode False smart idToCopy newParentId
-    children <- getChildrenById idToCopy
+    children <- getChildrenByParentId idToCopy
    for_ children $ \child -> copyNode True smart child copiedNode
    return copiedNode
  -- Single-node (non-recursive) copy:
...
src/Gargantext/Database/Query/Table/Node/Children.hs

...
@@ -63,9 +63,9 @@ getChildren a b c d e = getChildrenNode a b c d e
-- | Get the list of (IDs of) children of a given node (ID)
-getChildrenById :: NodeId              -- ^ ID of the parent node
-                -> DBCmd err [NodeId]  -- ^ List of IDs of the children nodes
-getChildrenById parentId = runPGSQuery
+getChildrenByParentId :: NodeId              -- ^ ID of the parent node
+                      -> DBCmd err [NodeId]  -- ^ List of IDs of the children nodes
+getChildrenByParentId parentId = runPGSQuery
  [sql| SELECT id FROM public.nodes WHERE parent_id = ?; |]
  parentId
...
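The renamed getChildrenByParentId returns only the ids of direct children, which is exactly what getNodes and copyNode above need. A trivial illustrative consumer (hypothetical, not part of the patch):

countChildren :: NodeId -> DBCmd err Int
countChildren parentId = length <$> getChildrenByParentId parentId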
src/Gargantext/Database/Query/Table/Node/Error.hs

...
@@ -39,7 +39,7 @@ data NodeCreationError
  = UserParentAlreadyExists UserId ParentId
  | UserParentDoesNotExist  UserId
  | UserHasNegativeId       UserId
-  | InsertNodeFailed        UserId ParentId
+  | InsertNodeFailed        UserId (Maybe ParentId)
  deriving (Show, Eq, Generic)

instance ToJSON NodeCreationError
...
@@ -85,6 +85,7 @@ data NodeError = NoListFound ListId
  | DoesNotExist NodeId
  | NodeIsReadOnly NodeId T.Text
  | MoveError NodeId NodeId T.Text
+  | NodeNotExportable NodeId T.Text

instance Prelude.Show NodeError
  where
...
@@ -101,6 +102,7 @@ instance Prelude.Show NodeError
  show (DoesNotExist n) = "Node does not exist (" <> show n <> ")"
  show (NodeIsReadOnly n reason) = "Node " <> show n <> " is read only, edits not allowed. Reason: " <> T.unpack reason
  show (MoveError s t reason) = "Moving " <> show s <> " to " <> show t <> " failed: " <> T.unpack reason
+  show (NodeNotExportable nid reason) = "Node " <> show nid <> " is not exportable: " <> show reason

instance ToJSON NodeError where
  toJSON (DoesNotExist n) =
...
src/Gargantext/Database/Schema/Context.hs

...
@@ -54,6 +54,8 @@ $(makeLensesWith abbreviatedFields ''ContextPoly)
------------------------------------------------------------------------
+-- | This datatype describes queries in the `contexts` table, where
+-- only `id` and `hyperdata` are fetched.
data ContextPolyOnlyId id hyperdata =
  ContextOnlyId { _context_oid_id        :: !id
                , _context_oid_hyperdata :: !hyperdata }
...
src/Gargantext/Database/Schema/Node.hs

...
@@ -16,9 +16,14 @@ Portability : POSIX
module Gargantext.Database.Schema.Node where

+import Codec.CBOR.JSON qualified as CBOR
+import Codec.Serialise
import Control.Lens hiding (elements, (&))
+import Data.Aeson (ToJSON, toJSON, parseJSON, FromJSON)
import Gargantext.Database.Schema.Prelude
+import Gargantext.Prelude (NFData(..))
import Prelude hiding (null, id, map, sum)
+import Data.Aeson.Types (parseEither)

------------------------------------------------------------------------
-- Main polymorphic Node definition
...
@@ -43,6 +48,41 @@ data NodePoly id
  , _node_hyperdata :: !hyperdata
  } deriving (Show, Generic)

+instance ( NFData i, NFData h, NFData t, NFData u, NFData p, NFData n, NFData d, NFData hy )
+  => NFData (NodePoly i h t u p n d hy) where
+
+instance ( Serialise i, Serialise h, Serialise t, Serialise u, Serialise p, Serialise n, Serialise d
+         , ToJSON json, FromJSON json )
+  => Serialise (NodePoly i h t u p n d json) where
+  encode Node { .. } =
+       encode _node_id
+    <> encode _node_hash_id
+    <> encode _node_typename
+    <> encode _node_user_id
+    <> encode _node_parent_id
+    <> encode _node_name
+    <> encode _node_date
+    <> CBOR.encodeValue (toJSON _node_hyperdata)
+  decode = do
+    _node_id          <- decode
+    _node_hash_id     <- decode
+    _node_typename    <- decode
+    _node_user_id     <- decode
+    _node_parent_id   <- decode
+    _node_name        <- decode
+    _node_date        <- decode
+    mb_node_hyperdata <- parseEither parseJSON <$> CBOR.decodeValue False
+    case mb_node_hyperdata of
+      Left err              -> fail err
+      Right _node_hyperdata -> pure Node { .. }

------------------------------------------------------------------------
-- Automatic instances derivation
$(deriveJSON (unPrefix "_node_") ''NodePoly)
...
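Because the Serialise instance above routes the hyperdata field through its JSON representation (CBOR.encodeValue / CBOR.decodeValue), it is worth checking that encode and decode stay inverse to each other. A minimal sketch using Codec.Serialise's top-level helpers (illustrative only, not part of the patch):

import Codec.Serialise (Serialise, deserialiseOrFail, serialise)

cborRoundtrips :: (Eq a, Serialise a) => a -> Bool
cborRoundtrips x =
  case deserialiseOrFail (serialise x) of
    Right y -> y == x
    Left _  -> False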
src/Gargantext/Orphans.hs

{-# OPTIONS_GHC -Wno-orphans #-}
module Gargantext.Orphans (
    module Gargantext.Orphans.OpenAPI
  ) where

import Data.Aeson qualified as JSON
import Gargantext.Database.Admin.Types.Hyperdata ( Hyperdata )
import Gargantext.Orphans.OpenAPI

instance Hyperdata JSON.Value
src/Gargantext/Orphans/OpenAPI.hs

...
@@ -7,16 +7,17 @@
{-# LANGUAGE LambdaCase #-}
module Gargantext.Orphans.OpenAPI where

+import Conduit qualified as C
import Control.Lens
import Data.HashMap.Strict.InsOrd qualified as HM
import Data.OpenApi as OpenAPI hiding (Header, Server)
import Data.OpenApi.Declare
+import Data.Swagger.Declare qualified as SwaggerDeclare
import Data.Swagger.Internal qualified as Swagger
import Data.Swagger qualified as Swagger
import Data.Text qualified as T
import Data.Typeable
import Prelude
-import qualified Data.Swagger.Declare as SwaggerDeclare
import Servant.API
import Servant.Auth
import Servant.OpenApi
...
@@ -85,6 +86,9 @@ class SwaggerConvertible a b where
-- Instances
--
+instance Typeable b => ToSchema (C.ConduitT () b IO ()) where
+  declareNamedSchema _ = pure $ NamedSchema Nothing binarySchema
+
instance SwaggerConvertible OpenAPI.Discriminator T.Text where
  swagConv = iso OpenAPI._discriminatorPropertyName convertDiscriminator
    where
...
src/Gargantext/Utils/UTCTime.hs

...
@@ -35,6 +35,7 @@ import Test.QuickCheck hiding (label)
newtype NUTCTime = NUTCTime UTCTime
  deriving (Eq, Show, Generic)
+  deriving newtype NFData

instance DecodeScalar NUTCTime where
  decodeScalar (DMT.String x) = case (readEither $ T.unpack x) of
    Right r -> pure $ NUTCTime r
...
stack.yaml

...
@@ -109,6 +109,7 @@
 - "servant-auth-swagger-0.2.11.0"
 - "servant-client-0.20.2"
 - "servant-client-core-0.20.2"
+- "servant-conduit-0.16.1"
 - "servant-ekg-0.3.1"
 - "servant-server-0.20.2"
 - "servant-swagger-1.2.1"
...
@@ -439,7 +440,7 @@ flags:
   formatting:
     "no-double-conversion": false
   gargantext:
-    "no-phylo-debug-logs": false
+    "no-phylo-debug-logs": true
     "test-crypto": false
   graphviz:
     "test-parsing": false
...
start

...
@@ -16,7 +16,7 @@ docker compose up -d
echo "GarganText: docker for postgresql database [OK]"
cd ../../
echo "GarganText: gargantext-server with Nix and Cabal..."
-nix-shell --run "cabal run gargantext-server -- --toml gargantext-settings.toml --run Prod +RTS > $LOGFILE 2>&1 & tail -F $LOGFILE"
+nix-shell --run "cabal v2-run gargantext -- server start-all --mode Prod --settings-path gargantext-settings.toml > $LOGFILE 2>&1 & tail -F $LOGFILE"
echo "GarganText: gargantext-server with Nix and Cabal [OK]"
echo "GarganText: project stopped."
test/Test/API/Authentication.hs

...
@@ -25,7 +25,7 @@ import Network.HTTP.Client hiding (Proxy)
import Network.HTTP.Types.Status (status403)
import Prelude qualified
import Servant.Auth.Client ()
-import Servant.Client
+import Servant.Client.Streaming
import Servant.Client.Core.Response qualified as SR
import Servant.Client.Generic (genericClient)
import Test.API.Routes (auth_api)
...
test/Test/API/Private.hs

...
@@ -16,10 +16,11 @@ import Gargantext.Database.Admin.Types.Hyperdata (HyperdataUser)
import Gargantext.Prelude hiding (get)
import Network.HTTP.Client hiding (Proxy)
import Servant.Auth.Client ()
-import Servant.Client
+import Servant.Client.Streaming
import Servant.Client.Generic (genericClient)
import Test.API.Prelude
import Test.API.Private.Move qualified as Move
+import Test.API.Private.Remote qualified as Remote
import Test.API.Private.Share qualified as Share
import Test.API.Private.Table qualified as Table
import Test.API.Routes (mkUrl, get_node, get_tree)
...
@@ -111,3 +112,5 @@ tests = sequential $ do
      Table.tests
    describe "Move API" $ do
      Move.tests
+    describe "Remote API" $ do
+      Remote.tests
test/Test/API/Private/Move.hs

...
@@ -11,7 +11,7 @@ import Gargantext.Core.Types
import Gargantext.Core.Types.Individu
import Gargantext.Database.Query.Table.NodeNode (SourceId(..), TargetId(..))
import Gargantext.Prelude
-import Servant.Client
+import Servant.Client.Streaming
import Test.API.Prelude
import Test.API.Routes
import Test.API.Setup
...
test/Test/API/Private/Remote.hs (new file, mode 100644)

{-# LANGUAGE BangPatterns #-}
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE TypeFamilies #-}

module Test.API.Private.Remote ( tests ) where

import Control.Lens
import Gargantext.API.Admin.EnvTypes (Mode(..))
import Gargantext.API.Errors
import Gargantext.API (makeApp)
import Gargantext.API.Routes.Client (remoteExportClient)
import Gargantext.API.Routes.Named.Remote (RemoteExportRequest(..))
import Gargantext.Core.Types.Individu
import Gargantext.Core.Types (NodeId(UnsafeMkNodeId))
import Gargantext.Prelude
import Gargantext.System.Logging
import Network.Wai.Handler.Warp qualified as Warp
import Network.Wai qualified as Wai
import Servant.Client.Streaming
import Test.API.Prelude
import Test.API.Setup
import Test.Database.Setup
import Test.Database.Types
import Test.Hspec (Spec, it, aroundAll, describe, sequential, shouldBe)
import Test.Hspec.Wai.Internal (withApplication)
import Test.Utils

-- | Helper to let us test transferring data between two instances.
withTwoServerInstances :: (SpecContext (TestEnv, Wai.Application, Warp.Port) -> IO ()) -> IO ()
withTwoServerInstances action =
  withTestDB $ \testEnv1 -> do
    withTestDB $ \testEnv2 -> do
      garg1App <- withLoggerHoisted Mock $ \ioLogger -> do
        env <- newTestEnv testEnv1 ioLogger server1Port
        makeApp env
      garg2App <- withLoggerHoisted Mock $ \ioLogger -> do
        env <- newTestEnv testEnv2 ioLogger server2Port
        makeApp env

      testWithApplicationOnPort (pure garg1App) server1Port $
        testWithApplicationOnPort (pure garg2App) server2Port $
          action (SpecContext testEnv1 server1Port garg1App (testEnv2, garg2App, server2Port))
  where
    server1Port = 8008
    server2Port = 9008

tests :: Spec
tests = sequential $ aroundAll withTwoServerInstances $ do
  describe "Prelude" $ do
    it "setup DB triggers" $ \SpecContext { .. } -> do
      forM_ [ _sctx_env, _sctx_data ^. _1 ] $ \e -> do
        setupEnvironment e
        void $ createAliceAndBob e

  describe "Copying nodes across instances" $ do
    it "should forbid moving a node the user doesn't own" $ \(SpecContext testEnv1 server1Port app1 (_testEnv2, _app2, server2Port)) -> do
      withApplication app1 $ do
        withValidLogin server1Port "alice" (GargPassword "alice") $ \aliceClientEnv aliceToken -> do
          withValidLogin server2Port "bob" (GargPassword "bob") $ \_bobClientEnv bobToken -> do
            liftIO $ do
              bobPublicFolderId <- getRootPublicFolderIdForUser testEnv1 (UserName "bob")
              let rq = RemoteExportRequest { _rer_instance_url  = fromMaybe (panicTrace "impossible") $ parseBaseUrl "http://localhost:9008"
                                           , _rer_instance_auth = bobToken }
              res <- runClientM (remoteExportClient aliceToken bobPublicFolderId rq) aliceClientEnv
              res `shouldFailWith` EC_403__policy_check_error

    it "supports trivial transfer between instances" $ \(SpecContext testEnv1 server1Port app1 (_testEnv2, _app2, server2Port)) -> do
      withApplication app1 $ do
        withValidLogin server1Port "alice" (GargPassword "alice") $ \aliceClientEnv aliceToken -> do
          corpusId <- liftIO $ newCorpusForUser testEnv1 "alice"
          withValidLogin server2Port "bob" (GargPassword "bob") $ \_bobClientEnv bobToken -> do
            liftIO $ do
              let rq = RemoteExportRequest { _rer_instance_url  = fromMaybe (panicTrace "impossible") $ parseBaseUrl "http://localhost:9008"
                                           , _rer_instance_auth = bobToken }
              res <- checkEither $ runClientM (remoteExportClient aliceToken corpusId rq) aliceClientEnv
              res `shouldBe` [ UnsafeMkNodeId 16 ]

    -- Certain node types (like private, share, etc) shouldn't be transferred.
    it "forbids transferring certain node types" $ \(SpecContext testEnv1 server1Port app1 (_testEnv2, _app2, server2Port)) -> do
      withApplication app1 $ do
        withValidLogin server1Port "alice" (GargPassword "alice") $ \aliceClientEnv aliceToken -> do
          folderId <- liftIO $ newPrivateFolderForUser testEnv1 "alice"
          withValidLogin server2Port "bob" (GargPassword "bob") $ \_bobClientEnv bobToken -> do
            liftIO $ do
              let rq = RemoteExportRequest { _rer_instance_url  = fromMaybe (panicTrace "impossible") $ parseBaseUrl "http://localhost:9008"
                                           , _rer_instance_auth = bobToken }
              res <- runClientM (remoteExportClient aliceToken folderId rq) aliceClientEnv
              res `shouldFailWith` EC_403__node_export_error
test/Test/API/Private/Share.hs

...
@@ -18,7 +18,7 @@ import Gargantext.Core.Types.Individu
import Gargantext.Prelude
import Prelude (fail)
import Servant.Auth.Client qualified as SC
-import Servant.Client
+import Servant.Client.Streaming
import Test.API.Prelude (newCorpusForUser)
import Test.API.Routes
import Test.API.Setup (SpecContext(..), dbEnvSetup, withTestDBAndPort)
...
test/Test/API/Private/Table.hs

...
@@ -11,7 +11,7 @@ import Gargantext.Core.Types
import Gargantext.Core.Types.Individu
import Gargantext.Database.Query.Facet qualified as Facet
import Gargantext.Prelude
-import Servant.Client
+import Servant.Client.Streaming
import Test.API.Prelude (checkEither)
import Test.API.Routes
import Test.API.Setup (SpecContext(..), dbEnvSetup, withTestDBAndPort)
...
test/Test/API/Routes.hs

...
@@ -37,6 +37,7 @@ import Gargantext.API.Errors
import Gargantext.API.HashedResponse (HashedResponse)
import Gargantext.API.Ngrams.List.Types (WithJsonFile, WithTextFile)
import Gargantext.API.Ngrams.Types (NgramsTable, NgramsTablePatch, OrderBy, TabType, Versioned, VersionedWithCount)
+import Gargantext.API.Routes.Client
import Gargantext.API.Routes.Named
import Gargantext.API.Routes.Named.List (updateListJSONEp, updateListTSVEp)
import Gargantext.API.Routes.Named.Node hiding (treeAPI)
...
@@ -55,23 +56,10 @@ import Gargantext.Database.Admin.Types.Hyperdata
import Gargantext.Database.Query.Facet qualified as Facet
import Gargantext.Database.Query.Table.NodeNode (SourceId(..), TargetId(..))
import Gargantext.Prelude
-import Network.HTTP.Types qualified as H
import Network.Wai.Handler.Warp (Port)
-import Servant.API.WebSocket qualified as WS
import Servant.Auth.Client qualified as S
-import Servant.Client (ClientM)
-import Servant.Client.Core (RunClient, HasClient(..), Request)
-import Servant.Client.Generic (genericClient, AsClientT)
+import Servant.Client.Streaming
+import Servant.Conduit ()
-
-instance RunClient m => HasClient m WS.WebSocketPending where
-  type Client m WS.WebSocketPending = H.Method -> m ()
-
-  clientWithRoute :: Proxy m -> Proxy WS.WebSocketPending -> Request -> Client m WS.WebSocketPending
-  clientWithRoute _pm Proxy _req _httpMethod = do
-    panicTrace "[WebSocket client] this is not implemented!"
-
-  hoistClientMonad _ _ f cl = \meth -> f (cl meth)

-- This is for requests made by http.client directly to hand-crafted URLs
...
@@ -85,12 +73,6 @@ mkUrl _port urlPiece =
gqlUrl :: ByteString
gqlUrl = "/gql"

--- | The client for the full API. It also serves as a \"proof\" that our
--- whole API has all the required instances to be used in a client.
-clientRoutes :: API (AsClientT ClientM)
-clientRoutes = genericClient

-- This is for Servant.Client requests
auth_api :: AuthRequest -> ClientM AuthResponse
auth_api = clientRoutes & apiWithCustomErrorScheme
...
test/Test/API/Setup.hs

...
@@ -9,6 +9,7 @@ module Test.API.Setup (
  , setupEnvironment
  , createAliceAndBob
  , dbEnvSetup
+  , newTestEnv
  ) where

import Control.Concurrent.Async qualified as Async
...
test/Test/API/UpdateList.hs

...
@@ -61,7 +61,7 @@ import Gargantext.Database.Query.Facet qualified as Facet
import Gargantext.Prelude hiding (get)
import Network.Wai.Handler.Warp qualified as Wai
import Paths_gargantext (getDataFileName)
-import Servant.Client
+import Servant.Client.Streaming
import System.FilePath
import Test.API.Prelude (checkEither, newCorpusForUser, newPrivateFolderForUser)
import Test.API.Routes (mkUrl, gqlUrl, get_table_ngrams, put_table_ngrams, toServantToken, clientRoutes, get_table, update_node, add_form_to_list, add_tsv_to_list)
...
test/Test/Instances.hs

...
@@ -506,6 +506,9 @@ genFrontendErr be = do
      -> do sId <- arbitrary
            tId <- arbitrary
            pure $ Errors.mkFrontendErr' txt $ Errors.FE_node_move_error sId tId "generic reason"
+    Errors.EC_403__node_export_error
+      -> do nId <- arbitrary
+            pure $ Errors.mkFrontendErr' txt $ Errors.FE_node_export_error nId "generic reason"
    -- validation error
    Errors.EC_400__validation_error
...
test/Test/Ngrams/Terms.hs
View file @
b3a89d3a
{-# OPTIONS_GHC -Wno-incomplete-uni-patterns #-}
{-# OPTIONS_GHC -Wno-incomplete-uni-patterns #-}
{-# LANGUAGE TypeApplications #-}
module
Test.Ngrams.Terms
(
tests
)
where
module
Test.Ngrams.Terms
(
tests
)
where
import
Data.HashMap.Strict
qualified
as
HashMap
import
Data.HashMap.Strict
qualified
as
HashMap
...
@@ -142,18 +140,16 @@ testNgramsByDoc01 = do
...
@@ -142,18 +140,16 @@ testNgramsByDoc01 = do
,
_hd_abstract
=
Nothing
}
,
_hd_abstract
=
Nothing
}
let
ctx2
=
ContextOnlyId
2
hd2
let
ctx2
=
ContextOnlyId
2
hd2
ngramsByDoc
EN
NgramsTerms
terms
[
ctx1
,
ctx2
]
@?=
ngramsByDoc
EN
NgramsTerms
terms
ctx1
@?=
[
HashMap
.
fromList
HashMap
.
fromList
[
(
SimpleNgrams
$
UnsafeNgrams
{
_ngramsTerms
=
"hello"
,
_ngramsSize
=
1
}
[
(
SimpleNgrams
$
UnsafeNgrams
{
_ngramsTerms
=
"hello"
,
_ngramsSize
=
1
}
,
Map
.
singleton
NgramsTerms
$
Map
.
singleton
(
UnsafeMkNodeId
1
)
(
1
,
1
)
)
,
Map
.
singleton
NgramsTerms
$
Map
.
singleton
(
UnsafeMkNodeId
1
)
(
1
,
1
)
)
,
(
SimpleNgrams
$
UnsafeNgrams
{
_ngramsTerms
=
"world"
,
_ngramsSize
=
1
}
,
(
SimpleNgrams
$
UnsafeNgrams
{
_ngramsTerms
=
"world"
,
_ngramsSize
=
1
}
,
Map
.
singleton
NgramsTerms
$
Map
.
singleton
(
UnsafeMkNodeId
1
)
(
1
,
1
)
)
,
Map
.
singleton
NgramsTerms
$
Map
.
singleton
(
UnsafeMkNodeId
1
)
(
1
,
1
)
)
]
]
,
HashMap
.
fromList
ngramsByDoc
EN
NgramsTerms
terms
ctx2
@?=
HashMap
.
fromList
[
(
SimpleNgrams
$
UnsafeNgrams
{
_ngramsTerms
=
"world"
,
_ngramsSize
=
1
}
[
(
SimpleNgrams
$
UnsafeNgrams
{
_ngramsTerms
=
"world"
,
_ngramsSize
=
1
}
,
Map
.
singleton
NgramsTerms
$
Map
.
singleton
(
UnsafeMkNodeId
2
)
(
1
,
2
)
)
,
Map
.
singleton
NgramsTerms
$
Map
.
singleton
(
UnsafeMkNodeId
2
)
(
1
,
2
)
)
]
]
]
ngramsByDoc
EN
NgramsTerms
terms
[
ctx1
,
ctx2
]
@?=
(
ngramsByDoc
EN
NgramsTerms
terms
[
ctx1
])
<>
(
ngramsByDoc
EN
NgramsTerms
terms
[
ctx2
])
test/Test/Offline/JSON.hs

...
@@ -13,6 +13,7 @@ import Gargantext.API.Errors
import Gargantext.API.Node.Corpus.Types
import Gargantext.API.Node.Types
import Gargantext.API.Routes.Named.Publish (PublishRequest)
+import Gargantext.API.Routes.Named.Remote
import Gargantext.API.Viz.Types
import Gargantext.Core.Types.Phylo
import Gargantext.Core.Viz.Phylo qualified as VizPhylo
...
@@ -56,6 +57,7 @@ tests = testGroup "JSON" [
  , testProperty "Datafield roundtrips" (jsonRoundtrip @Datafield)
  , testProperty "WithQuery roundtrips" (jsonRoundtrip @WithQuery)
  , testProperty "PublishRequest roundtrips" (jsonRoundtrip @PublishRequest)
+  , testProperty "RemoteExportRequest roundtrips" (jsonRoundtrip @RemoteExportRequest)
  , testProperty "FrontendError roundtrips" jsonFrontendErrorRoundtrip
  , testProperty "BackendErrorCode roundtrips" (jsonEnumRoundtrip (Dict @_ @BackendErrorCode))
  , testProperty "NodeType roundtrips" (jsonEnumRoundtrip (Dict @_ @NodeType))
...
test/Test/Server/ReverseProxy.hs

...
@@ -8,7 +8,7 @@ import Network.HTTP.Client
import Network.HTTP.Types.Status
import Prelude
import Servant.Auth.Client (Token(..))
-import Servant.Client
+import Servant.Client.Streaming
import Servant.Client.Generic (genericClient)
import Test.API.Setup (setupEnvironment, withBackendServerAndProxy, createAliceAndBob)
import Test.Hspec
...
test/Test/Utils.hs

...
@@ -61,7 +61,7 @@ import Network.Wai.Handler.Warp (Port)
import Network.Wai.Test (SResponse(..))
import Network.WebSockets qualified as WS
import Prelude qualified
-import Servant.Client (ClientEnv, baseUrlPort, mkClientEnv, parseBaseUrl, runClientM, makeClientRequest, defaultMakeClientRequest)
+import Servant.Client.Streaming (ClientEnv, baseUrlPort, mkClientEnv, parseBaseUrl, runClientM, makeClientRequest, defaultMakeClientRequest)
import Servant.Client.Core (BaseUrl)
import Servant.Client.Core.Request qualified as Client
import System.Environment (lookupEnv)
...