haskell-gargantext: commit 7985388f

Merge remote-tracking branch 'origin/281-dev-ngrams-fixes' into dev

Authored Nov 07, 2023 by Alexandre Delanoë
Parents: c71dbb14, 99e38af8

Showing 18 changed files with 695 additions and 287 deletions (+695 -287)
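The dominant change in this merge is moving the in-memory node-story state from an MVar to an STM TVar. As orientation before the per-file diffs, here is a minimal, self-contained sketch (a plain Int counter, not the project's NodeListStory type) contrasting the two read-modify-write styles that recur throughout the diff below.

import Control.Concurrent.MVar (modifyMVar_, newMVar, readMVar)
import Control.Concurrent.STM (atomically, newTVarIO, readTVar, readTVarIO, writeTVar)

-- A toy shared counter standing in for the node-story state.
main :: IO ()
main = do
  -- MVar style (before): take the value, update it, put it back.
  mvar <- newMVar (0 :: Int)
  modifyMVar_ mvar $ \n -> pure (n + 1)
  readMVar mvar >>= print

  -- TVar style (after this merge): read and write grouped into one
  -- atomic STM transaction.
  tvar <- newTVarIO (0 :: Int)
  atomically $ do
    n <- readTVar tvar
    writeTVar tvar (n + 1)
  readTVarIO tvar >>= print

With the TVar form the read and the write are grouped into a single transaction, so concurrent writers retry instead of blocking on a lock.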
Changed files:

  gargantext.cabal                              +3   -2
  src/Gargantext/API/Admin/EnvTypes.hs          +0   -6
  src/Gargantext/API/Admin/Settings.hs          +2   -2
  src/Gargantext/API/Dev.hs                     +2   -2
  src/Gargantext/API/Ngrams.hs                  +80  -68
  src/Gargantext/API/Ngrams/List.hs             +5   -1
  src/Gargantext/API/Ngrams/Tools.hs            +23  -25
  src/Gargantext/API/Ngrams/Types.hs            +2   -0
  src/Gargantext/Core/NodeStory.hs              +194 -156
  src/Gargantext/Core/NodeStoryFile.hs.old      +1   -1
  src/Gargantext/Database/Action/Flow/List.hs   +7   -5
  test/Test/API/Setup.hs                        +4   -1
  test/Test/Database/Operations.hs              +30  -4
  test/Test/Database/Operations/NodeStory.hs    +306 -0
  test/Test/Database/Setup.hs                   +7   -1
  test/Test/Database/Types.hs                   +16  -0
  test/Test/Ngrams/Query.hs                     +12  -13
  test/drivers/hspec/Main.hs                    +1   -0
gargantext.cabal

@@ -130,6 +130,7 @@ library
       Gargantext.Database.Admin.Types.Node
       Gargantext.Database.Prelude
       Gargantext.Database.Query.Facet
+      Gargantext.Database.Query.Table.Ngrams
       Gargantext.Database.Query.Table.NgramsPostag
       Gargantext.Database.Query.Table.Node
       Gargantext.Database.Query.Table.Node.Error
@@ -216,7 +217,6 @@ library
       Gargantext.Core.Methods.Similarities.Accelerate.SpeGen
       Gargantext.Core.Methods.Similarities.Conditional
       Gargantext.Core.Methods.Similarities.Distributional
-      Gargantext.Core.NodeStoryFile
       Gargantext.Core.Statistics
       Gargantext.Core.Text.Convert
       Gargantext.Core.Text.Corpus.API.Hal
@@ -329,7 +329,6 @@ library
       Gargantext.Database.Query.Table.Context
       Gargantext.Database.Query.Table.ContextNodeNgrams
      Gargantext.Database.Query.Table.ContextNodeNgrams2
-      Gargantext.Database.Query.Table.Ngrams
      Gargantext.Database.Query.Table.Node.Children
      Gargantext.Database.Query.Table.Node.Contact
      Gargantext.Database.Query.Table.Node.Document.Add
@@ -892,6 +891,7 @@ test-suite garg-test-tasty
      Test.Core.Utils
      Test.Database.Operations
      Test.Database.Operations.DocumentSearch
+      Test.Database.Operations.NodeStory
      Test.Database.Setup
      Test.Database.Types
      Test.Graph.Clustering
@@ -1002,6 +1002,7 @@ test-suite garg-test-hspec
      Test.API.Setup
      Test.Database.Operations
      Test.Database.Operations.DocumentSearch
+      Test.Database.Operations.NodeStory
      Test.Database.Setup
      Test.Database.Types
      Test.Utils
src/Gargantext/API/Admin/EnvTypes.hs

@@ -138,9 +138,6 @@ instance HasNodeStoryEnv Env where
 instance HasNodeStoryVar Env where
   hasNodeStoryVar = hasNodeStory . nse_getter
 
-instance HasNodeStorySaver Env where
-  hasNodeStorySaver = hasNodeStory . nse_saver
-
 instance HasNodeStoryImmediateSaver Env where
   hasNodeStoryImmediateSaver = hasNodeStory . nse_saver_immediate
@@ -310,9 +307,6 @@ instance HasNodeStoryEnv DevEnv where
 instance HasNodeStoryVar DevEnv where
   hasNodeStoryVar = hasNodeStory . nse_getter
 
-instance HasNodeStorySaver DevEnv where
-  hasNodeStorySaver = hasNodeStory . nse_saver
-
 instance HasNodeStoryImmediateSaver DevEnv where
   hasNodeStoryImmediateSaver = hasNodeStory . nse_saver_immediate
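The deleted instances belonged to the now-removed HasNodeStorySaver class; the surviving instances follow the lens Getter accessor pattern (hasNodeStoryVar = hasNodeStory . nse_getter). A minimal sketch of that pattern with a hypothetical ToyEnv and class name, not the project's real types:

{-# LANGUAGE RankNTypes #-}

import Control.Lens (Getter, to, view)

-- Hypothetical environment record; only the saver field matters here.
data ToyEnv = ToyEnv { _toySaver :: IO () }

-- Same shape as HasNodeStoryImmediateSaver: expose a piece of the
-- environment as a lens Getter so call sites can use `view`.
class HasImmediateSaver env where
  immediateSaver :: Getter env (IO ())

instance HasImmediateSaver ToyEnv where
  immediateSaver = to _toySaver

runSaver :: HasImmediateSaver env => env -> IO ()
runSaver env = view immediateSaver env

main :: IO ()
main = runSaver (ToyEnv (putStrLn "state saved"))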
src/Gargantext/API/Admin/Settings.hs

@@ -186,8 +186,8 @@ newEnv logger port file = do
   !self_url_env <- parseBaseUrl $ "http://0.0.0.0:" <> show port
   dbParam <- databaseParameters file
   !pool <- newPool dbParam
-  --nodeStory_env <- readNodeStoryEnv (_gc_repofilepath config_env)
-  !nodeStory_env <- readNodeStoryEnv pool
+  --nodeStory_env <- fromDBNodeStoryEnv (_gc_repofilepath config_env)
+  !nodeStory_env <- fromDBNodeStoryEnv pool
   !scrapers_env <- newJobEnv defaultSettings manager_env
   secret <- Jobs.genSecret
src/Gargantext/API/Dev.hs

@@ -38,9 +38,9 @@ withDevEnv iniPath k = withLoggerHoisted Dev $ \logger -> do
     newDevEnv logger = do
       cfg     <- readConfig         iniPath
      dbParam <- databaseParameters iniPath
-      --nodeStory_env <- readNodeStoryEnv (_gc_repofilepath cfg)
+      --nodeStory_env <- fromDBNodeStoryEnv (_gc_repofilepath cfg)
      pool <- newPool dbParam
-      nodeStory_env <- readNodeStoryEnv pool
+      nodeStory_env <- fromDBNodeStoryEnv pool
      setts   <- devSettings devJwkFile
      mail    <- Mail.readConfig iniPath
      nlp_config <- NLP.readConfig iniPath
src/Gargantext/API/Ngrams.hs

@@ -87,8 +87,7 @@ module Gargantext.API.Ngrams
   )
   where
 
-import Control.Concurrent
-import Control.Lens ((.~), view, (^.), (^..), (+~), (%~), (.~), msumOf, at, _Just, Each(..), (%%~), mapped, non, ifolded, to, withIndex, over)
+import Control.Lens ((.~), view, (^.), (^..), (+~), (%~), (.~), msumOf, at, _Just, Each(..), (%%~), mapped, ifolded, to, withIndex, over)
 import Control.Monad.Reader
 import Data.Aeson.Text qualified as DAT
 import Data.Foldable
@@ -123,6 +122,7 @@ import Gargantext.Database.Schema.Node (node_id, node_parent_id, node_user_id)
 import Gargantext.Prelude hiding (log, to, toLower, (%))
 import Gargantext.Prelude.Clock (hasTime, getTime)
 import Gargantext.Utils.Jobs (serveJobsAPI, MonadJobStatus(..))
+import GHC.Conc (readTVar, writeTVar)
 import Prelude (error)
 import Servant hiding (Patch)
@@ -173,10 +173,10 @@ mkChildrenGroups addOrRem nt patches =
 ------------------------------------------------------------------------
-saveNodeStory :: ( MonadReader env m, MonadBase IO m, HasNodeStorySaver env )
+saveNodeStory :: ( MonadReader env m, MonadBase IO m, HasNodeStoryImmediateSaver env )
               => m ()
 saveNodeStory = do
-  saver <- view hasNodeStorySaver
+  saver <- view hasNodeStoryImmediateSaver
   liftBase $ do
     --Gargantext.Prelude.putStrLn "---- Running node story saver ----"
     saver
@@ -249,7 +249,6 @@ addListNgrams listId ngramsType nes = do
 -- | TODO: incr the Version number
 -- && should use patch
 -- UNSAFE
-
 setListNgrams :: HasNodeStory env err m
               => NodeId
               -> TableNgrams.NgramsType
@@ -257,15 +256,18 @@ setListNgrams :: HasNodeStory env err m
               -> m ()
 setListNgrams listId ngramsType ns = do
   -- printDebug "[setListNgrams]" (listId, ngramsType)
-  getter <- view hasNodeStory
-  var <- liftBase $ (getter ^. nse_getter) [listId]
-  liftBase $ modifyMVar_ var $
-    pure . ( unNodeStory
-           . at listId
-           . _Just
-           . a_state
-           . at ngramsType
-           .~ Just ns
-           )
+  var <- getNodeStoryVar [listId]
+  liftBase $ atomically $ do
+    nls <- readTVar var
+    writeTVar var $
+      ( unNodeStory
+      . at listId
+      . _Just
+      . a_state
+      . at ngramsType
+      %~ (\mns' -> case mns' of
+            Nothing  -> Just ns
+            Just ns' -> Just $ ns <> ns')
+      ) nls
   saveNodeStory
@@ -292,9 +294,9 @@ commitStatePatch listId (Versioned _p_version p) = do
   -- printDebug "[commitStatePatch]" listId
   var <- getNodeStoryVar [listId]
   archiveSaver <- view hasNodeArchiveStoryImmediateSaver
-  vq' <- liftBase $ modifyMVar var $ \ns -> do
+  ns <- liftBase $ atomically $ readTVar var
     let
-      a = ns ^. unNodeStory . at listId . _Just
+      a = ns ^. unNodeStory . at listId . non initArchive
      -- apply patches from version p_version to a ^. a_version
      -- TODO Check this
      --q = mconcat $ take (a ^. a_version - p_version) (a ^. a_history)
@@ -334,15 +336,25 @@ commitStatePatch listId (Versioned _p_version p) = do
       -- snapshot. Node Story archive is a linear table, so it's only
       -- couple of inserts, it shouldn't take long...
+      -- NOTE This is changed now. Before we used MVar's, now it's TVars
+      -- (MVar's blocked). It was wrapped in withMVar before, now we read
+      -- the TVar, modify archive with archiveSaver, then write the TVar.
       -- If we postponed saving the archive to the debounce action, we
       -- would have issues like
       -- https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/476
       -- where the `q` computation from above (which uses the archive)
       -- would cause incorrect patch application (before the previous
       -- archive was saved and applied)
-      newNs' <- archiveSaver $ fst newNs
-      pure (newNs', snd newNs)
+      -- newNs' <- archiveSaver $ fst newNs
+      liftBase $ do
+        newNs' <- archiveSaver $ fst newNs
+        atomically $ writeTVar var newNs'
+        -- pure (newNs', snd newNs)
+        -- writeTVar var newNs'
+        --pure newNs
 
   -- Save new ngrams
   _ <- insertNgrams (newNgramsFromNgramsStatePatch p)
@@ -350,7 +362,7 @@ commitStatePatch listId (Versioned _p_version p) = do
   -- saveNodeStory
   saveNodeStoryImmediate
 
-  pure vq'
+  pure $ snd newNs
@@ -363,10 +375,10 @@ tableNgramsPull :: HasNodeStory env err m
 tableNgramsPull listId ngramsType p_version = do
   -- printDebug "[tableNgramsPull]" (listId, ngramsType)
   var <- getNodeStoryVar [listId]
-  r <- liftBase $ readMVar var
+  r <- liftBase $ atomically $ readTVar var
 
   let
-    a = r ^. unNodeStory . at listId . _Just
+    a = r ^. unNodeStory . at listId . non initArchive
     q = mconcat $ take (a ^. a_version - p_version) (a ^. a_history)
     q_table = q ^. _PatchMap . at ngramsType . _Just
@@ -491,7 +503,7 @@ getNgramsTableMap :: HasNodeStory env err m
                   -> m (Versioned NgramsTableMap)
 getNgramsTableMap nodeId ngramsType = do
   v <- getNodeStoryVar [nodeId]
-  repo <- liftBase $ readMVar v
+  repo <- liftBase $ atomically $ readTVar v
   pure $ Versioned (repo ^. unNodeStory . at nodeId . _Just . a_version)
                    (repo ^. unNodeStory . at nodeId . _Just . a_state . at ngramsType . _Just)
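commitStatePatch above now reads the TVar, runs the archive saver in IO, and writes the result back. Since IO cannot run inside an STM transaction, the read and the write are necessarily two separate transactions with the saver in between. A hedged sketch of that shape, where saveToDb is a made-up stand-in for archiveSaver:

import Control.Concurrent.STM (TVar, atomically, newTVarIO, readTVar, readTVarIO, writeTVar)

-- Hypothetical stand-in for the archive saver: an IO action that
-- persists the state and returns (a possibly adjusted copy of) it.
saveToDb :: [String] -> IO [String]
saveToDb xs = do
  putStrLn ("saving " <> show xs)
  pure xs

-- Read in one transaction, run the IO saver in between, write the
-- result back in another transaction.
saveAndUpdate :: TVar [String] -> IO ()
saveAndUpdate var = do
  ns  <- atomically $ readTVar var
  ns' <- saveToDb ns
  atomically $ writeTVar var ns'

main :: IO ()
main = do
  var <- newTVarIO ["patch-1"]
  saveAndUpdate var
  readTVarIO var >>= print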
src/Gargantext/API/Ngrams/List.hs

@@ -122,7 +122,11 @@ setList :: HasNodeStory env err m
 setList l m  = do
   -- TODO check with Version for optim
   -- printDebug "New list as file" l
-  _ <- mapM (\(nt, Versioned _v ns) -> setListNgrams l nt ns) $ toList m
+  _ <- mapM (\(nt, Versioned _v ns) -> (setListNgrams l nt ns)) $ toList m
+  -- v <- getNodeStoryVar [l]
+  -- liftBase $ do
+  --   ns <- atomically $ readTVar v
+  --   printDebug "[setList] node story: " ns
   -- TODO reindex
   pure True
src/Gargantext/API/Ngrams/Tools.hs

@@ -14,22 +14,20 @@ Portability : POSIX
 module Gargantext.API.Ngrams.Tools
   where
 
-import Control.Concurrent
 import Control.Lens (_Just, (^.), at, view, At, Index, IxValue)
 import Control.Monad.Reader
 import Data.HashMap.Strict (HashMap)
 import Data.HashMap.Strict qualified as HM
 import Data.Map.Strict qualified as Map
-import Data.Pool (withResource)
 import Data.Set qualified as Set
 import Data.Validity
 import Gargantext.API.Ngrams.Types
 import Gargantext.Core.NodeStory
-import Gargantext.Core.NodeStoryFile qualified as NSF
-import Gargantext.Core.Types (ListType(..), NodeId, NodeType(..), ListId)
-import Gargantext.Database.Prelude (HasConnectionPool(..))
+-- import Gargantext.Core.NodeStoryFile qualified as NSF
+import Gargantext.Core.Types (ListType(..), NodeId, ListId)
 import Gargantext.Database.Schema.Ngrams (NgramsType)
 import Gargantext.Prelude
+import GHC.Conc (TVar, readTVar)
 
 mergeNgramsElement :: NgramsRepoElement -> NgramsRepoElement -> NgramsRepoElement
@@ -43,7 +41,7 @@ getRepo :: HasNodeStory env err m
 getRepo listIds = do
   f <- getNodeListStory
   v <- liftBase $ f listIds
-  v' <- liftBase $ readMVar v
+  v' <- liftBase $ atomically $ readTVar v
   pure $ v'
@@ -58,7 +56,7 @@ repoSize repo node_id = Map.map Map.size state'
 getNodeStoryVar :: HasNodeStory env err m
-                => [ListId] -> m (MVar NodeListStory)
+                => [ListId] -> m (TVar NodeListStory)
 getNodeStoryVar l = do
   f <- getNodeListStory
   v <- liftBase $ f l
@@ -66,7 +64,7 @@ getNodeStoryVar l = do
 getNodeListStory :: HasNodeStory env err m
-                 => m ([NodeId] -> IO (MVar NodeListStory))
+                 => m ([NodeId] -> IO (TVar NodeListStory))
 getNodeListStory = do
   env <- view hasNodeStory
   pure $ view nse_getter env
@@ -228,20 +226,20 @@ getCoocByNgrams'' (Diagonal diag) (f1,f2) (m1,m2) =
 ------------------------------------------
-migrateFromDirToDb :: (HasNodeStory env err m) -- , HasNodeStory env err m)
-                   => m ()
-migrateFromDirToDb = do
-  pool <- view connPool
-  withResource pool $ \c -> do
-    listIds <- liftBase $ getNodesIdWithType c NodeList
-    -- printDebug "[migrateFromDirToDb] listIds" listIds
-    (NodeStory nls) <- NSF.getRepoReadConfig listIds
-    -- printDebug "[migrateFromDirToDb] nls" nls
-    _ <- mapM (\(nId, a) -> do
-                  n <- liftBase $ nodeExists c nId
-                  case n of
-                    False -> pure ()
-                    True  -> liftBase $ upsertNodeStories c nId a
-              ) $ Map.toList nls
-    _ <- nodeStoryIncs (Just $ NodeStory nls) listIds
-    pure ()
+-- migrateFromDirToDb :: (HasNodeStory env err m) -- , HasNodeStory env err m)
+--                    => m ()
+-- migrateFromDirToDb = do
+--   pool <- view connPool
+--   withResource pool $ \c -> do
+--     listIds <- liftBase $ getNodesIdWithType c NodeList
+--     -- printDebug "[migrateFromDirToDb] listIds" listIds
+--     (NodeStory nls) <- NSF.getRepoReadConfig listIds
+--     -- printDebug "[migrateFromDirToDb] nls" nls
+--     _ <- mapM (\(nId, a) -> do
+--                   n <- liftBase $ nodeExists c nId
+--                   case n of
+--                     False -> pure ()
+--                     True  -> liftBase $ upsertNodeStories c nId a
+--               ) $ Map.toList nls
+--     --_ <- nodeStoryIncs (Just $ NodeStory nls) listIds
+--     pure ()
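getRepo and getNgramsTableMap now spell the read as atomically (readTVar v). A small sketch, for reference, showing that this is the same as the shorter readTVarIO:

import Control.Concurrent.STM (atomically, newTVarIO, readTVar, readTVarIO)

main :: IO ()
main = do
  v <- newTVarIO ("node story" :: String)
  -- The two reads below are equivalent; the diff uses the first form,
  -- readTVarIO is the shorter one.
  a <- atomically (readTVar v)
  b <- readTVarIO v
  print (a == b)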
src/Gargantext/API/Ngrams/Types.hs

@@ -147,7 +147,9 @@ makeLenses ''RootParent
 data NgramsRepoElement = NgramsRepoElement
   { _nre_size     :: !Int
   , _nre_list     :: !ListType
+  -- root is the top-most parent of ngrams
   , _nre_root     :: !(Maybe NgramsTerm)
+  -- parent is the direct parent of this ngram
   , _nre_parent   :: !(Maybe NgramsTerm)
   , _nre_children :: !(MSet NgramsTerm)
   }
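The two added comments distinguish _nre_root (top-most ancestor) from _nre_parent (direct parent). A small hypothetical three-level example with a toy record, not the real NgramsRepoElement:

-- Hypothetical hierarchy: "animals" <- "cats" <- "cat".
-- "cat"'s direct parent is "cats", while its root is the top-most
-- ancestor "animals"; "animals" itself has neither parent nor root.
data Node = Node { name :: String, root :: Maybe String, parent :: Maybe String }
  deriving Show

hierarchy :: [Node]
hierarchy =
  [ Node "animals" Nothing          Nothing
  , Node "cats"    (Just "animals") (Just "animals")
  , Node "cat"     (Just "animals") (Just "cats")
  ]

main :: IO ()
main = mapM_ print hierarchy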
src/Gargantext/Core/NodeStory.hs

@@ -52,8 +52,6 @@ module Gargantext.Core.NodeStory
   , hasNodeStory
   , HasNodeStoryVar
   , hasNodeStoryVar
-  , HasNodeStorySaver
-  , hasNodeStorySaver
   , HasNodeStoryImmediateSaver
   , hasNodeStoryImmediateSaver
   , HasNodeArchiveStoryImmediateSaver
@@ -61,11 +59,11 @@ module Gargantext.Core.NodeStory
   , NodeStory(..)
   , NgramsStatePatch'
   , NodeListStory
+  , ArchiveList
   , initNodeListStoryMock
   , NodeStoryEnv(..)
   , initNodeStory
   , nse_getter
-  , nse_saver
   , nse_saver_immediate
   , nse_archive_saver_immediate
   , nse_var
@@ -73,6 +71,8 @@ module Gargantext.Core.NodeStory
   , getNodesArchiveHistory
   , Archive(..)
   , initArchive
+  , archiveAdvance
+  , unionArchives
   , a_history
   , a_state
   , a_version
@@ -82,7 +82,7 @@ module Gargantext.Core.NodeStory
   , runPGSAdvisoryUnlock
   , runPGSAdvisoryXactLock
   , getNodesIdWithType
-  , readNodeStoryEnv
+  , fromDBNodeStoryEnv
   , upsertNodeStories
   , getNodeStory
   , nodeStoriesQuery
@@ -93,9 +93,8 @@ module Gargantext.Core.NodeStory
   where
 
 import Codec.Serialise.Class
-import Control.Debounce (mkDebounce, defaultDebounceSettings, debounceFreq, debounceAction)
 import Control.Exception (throw)
-import Control.Lens (makeLenses, Getter, (^.), (.~), (%~), _Just, at, view)
+import Control.Lens (makeLenses, Getter, (^.), (.~), (%~), non, _Just, at, view)
 import Control.Monad.Except
 import Control.Monad.Reader
 import Data.Aeson hiding ((.=), decode)
@@ -112,7 +111,9 @@ import Data.Text qualified as Text
 import Database.PostgreSQL.Simple qualified as PGS
 import Database.PostgreSQL.Simple.FromField (FromField(fromField), fromJSONField)
 import Database.PostgreSQL.Simple.SqlQQ (sql)
+import Database.PostgreSQL.Simple.ToField qualified as PGS
 import Database.PostgreSQL.Simple.Types (Values(..), QualifiedIdentifier(..))
+import GHC.Conc (TVar, newTVar, readTVar, writeTVar)
 import Gargantext.API.Ngrams.Types
 import Gargantext.Core.Types (ListId, NodeId(..), NodeType)
 import Gargantext.Core.Utils.Prefix (unPrefix)
@@ -121,17 +122,15 @@ import Gargantext.Database.Prelude (DbCmd', HasConnectionPool(..))
 import Gargantext.Database.Query.Table.Ngrams qualified as TableNgrams
 import Gargantext.Database.Query.Table.Node.Error (HasNodeError())
 import Gargantext.Database.Schema.Ngrams (NgramsType)
-import Gargantext.Prelude
+import Gargantext.Prelude hiding (to)
 import Opaleye (DefaultFromField(..), SqlJsonb, fromPGSFromField)
-import qualified Database.PostgreSQL.Simple.ToField as PGS
 
 ------------------------------------------------------------------------
 data NodeStoryEnv = NodeStoryEnv
-  { _nse_var    :: !(MVar NodeListStory)
-  , _nse_saver  :: !(IO ())
+  { _nse_var    :: !(TVar NodeListStory)
   , _nse_saver_immediate :: !(IO ())
   , _nse_archive_saver_immediate :: !(NodeListStory -> IO NodeListStory)
-  , _nse_getter :: !([NodeId] -> IO (MVar NodeListStory))
+  , _nse_getter :: !([NodeId] -> IO (TVar NodeListStory))
   --, _nse_cleaner :: !(IO ()) -- every 12 hours: cleans the repos of unused NodeStories
   -- , _nse_lock  :: !FileLock -- TODO (it depends on the option: if with database or file only)
   }
@@ -144,15 +143,12 @@ type HasNodeStory env err m = ( DbCmd' env err m
                               , HasNodeError err
                               )
 
-class ( HasNodeStoryVar env, HasNodeStorySaver env )
+class ( HasNodeStoryVar env, HasNodeStoryImmediateSaver env )
   => HasNodeStoryEnv env where
     hasNodeStory :: Getter env NodeStoryEnv
 
 class HasNodeStoryVar env where
-  hasNodeStoryVar :: Getter env ([NodeId] -> IO (MVar NodeListStory))
-
-class HasNodeStorySaver env where
-  hasNodeStorySaver :: Getter env (IO ())
+  hasNodeStoryVar :: Getter env ([NodeId] -> IO (TVar NodeListStory))
 
 class HasNodeStoryImmediateSaver env where
   hasNodeStoryImmediateSaver :: Getter env (IO ())
@@ -167,7 +163,7 @@ class HasNodeArchiveStoryImmediateSaver env where
    is implemented already
 -}
 newtype NodeStory s p = NodeStory { _unNodeStory :: Map NodeId (Archive s p) }
-  deriving (Generic, Show)
+  deriving (Generic, Show, Eq)
 
 instance (FromJSON s, FromJSON p) => FromJSON (NodeStory s p)
 instance (ToJSON s, ToJSON p) => ToJSON (NodeStory s p)
@@ -187,13 +183,14 @@ data Archive s p = Archive
   -- structure holds only recent history, the one that will be
   -- inserted to the DB.
   }
-  deriving (Generic, Show)
+  deriving (Generic, Show, Eq)
 
 instance (Serialise s, Serialise p) => Serialise (Archive s p)
 
 type NodeListStory = NodeStory NgramsState' NgramsStatePatch'
 
+-- NOTE: 'type NgramsTableMap = Map NgramsTerm NgramsRepoElement'
 type NgramsState'      = Map TableNgrams.NgramsType NgramsTableMap
 type NgramsStatePatch' = PatchMap TableNgrams.NgramsType NgramsTablePatch
 instance Serialise NgramsStatePatch'
@@ -211,29 +208,47 @@ instance DefaultFromField SqlJsonb (Archive NgramsState' NgramsStatePatch')
 combineState :: NgramsState' -> NgramsState' -> NgramsState'
 combineState = Map.unionWith (<>)
 
-instance (Semigroup s, Semigroup p) => Semigroup (Archive s p) where
-  (<>) (Archive { _a_history = p }) (Archive { _a_version = v'
-                                             , _a_state = s'
-                                             , _a_history = p' }) =
-    Archive { _a_version = v'
-            , _a_state = s'
-            , _a_history = p' <> p }
-
-instance (Monoid s, Semigroup p) => Monoid (Archive s p) where
-  mempty = Archive { _a_version = 0
-                   , _a_state = mempty
-                   , _a_history = [] }
+-- This is not a typical Semigroup instance. The state is not
+-- appended, instead it is replaced with the second entry. This is
+-- because state changes with each version. We have to take into
+-- account the removal of terms as well.
+-- instance (Semigroup s, Semigroup p) => Semigroup (Archive s p) where
+--   (<>) (Archive { _a_history = p }) (Archive { _a_version = v'
+--                                              , _a_state = s'
+--                                              , _a_history = p' }) =
+--     Archive { _a_version = v'
+--             , _a_state = s'
+--             , _a_history = p' <> p }
+-- instance (Monoid s, Semigroup p) => Monoid (Archive s p) where
+--   mempty = Archive { _a_version = 0
+--                    , _a_state = mempty
+--                    , _a_history = [] }
 
 instance (FromJSON s, FromJSON p) => FromJSON (Archive s p) where
   parseJSON = genericParseJSON $ unPrefix "_a_"
 
 instance (ToJSON s, ToJSON p) => ToJSON (Archive s p) where
   toJSON     = genericToJSON     $ unPrefix "_a_"
   toEncoding = genericToEncoding $ unPrefix "_a_"
 
+-- | This is the normal way to update archive state, bumping the
+-- version and history. Resulting state is taken directly from new
+-- archive, omitting old archive completely.
+archiveAdvance :: (Semigroup s, Semigroup p) => Archive s p -> Archive s p -> Archive s p
+archiveAdvance aOld aNew = aNew { _a_history = _a_history aNew <> _a_history aOld }
+
+-- | This is to merge archive states.
+unionArchives :: (Semigroup s, Semigroup p) => Archive s p -> Archive s p -> Archive s p
+unionArchives aOld aNew = aNew { _a_state   = _a_state aOld <> _a_state aNew
+                               , _a_history = _a_history aNew <> _a_history aOld }
+
 ------------------------------------------------------------------------
 initNodeStory :: (Monoid s, Semigroup p) => NodeId -> NodeStory s p
 initNodeStory ni = NodeStory $ Map.singleton ni initArchive
 
 initArchive :: (Monoid s, Semigroup p) => Archive s p
-initArchive = mempty
+initArchive = Archive { _a_version = 0
+                      , _a_state = mempty
+                      , _a_history = [] }
 
 initNodeListStoryMock :: NodeListStory
 initNodeListStoryMock = NodeStory $ Map.singleton nodeListId archive
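To make the difference between the two new combinators concrete, here is a hedged rendering of archiveAdvance and unionArchives with list-valued state and history standing in for the real NgramsState' and patch types; the field logic mirrors the definitions in the hunk above.

-- Toy archive with simplified field types (assumption, not project code).
data Archive = Archive { aVersion :: Int, aState :: [String], aHistory :: [String] }
  deriving Show

-- Advance: keep the new state as-is, accumulate history.
archiveAdvance :: Archive -> Archive -> Archive
archiveAdvance aOld aNew = aNew { aHistory = aHistory aNew <> aHistory aOld }

-- Union: merge the states as well as the histories.
unionArchives :: Archive -> Archive -> Archive
unionArchives aOld aNew = aNew { aState   = aState aOld <> aState aNew
                               , aHistory = aHistory aNew <> aHistory aOld }

main :: IO ()
main = do
  let old = Archive 1 ["cat"] ["p1"]
      new = Archive 2 ["dog"] ["p2"]
  print (archiveAdvance old new)  -- state comes only from `new`
  print (unionArchives  old new)  -- states are merged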
@@ -300,6 +315,16 @@ runPGSExecuteMany c qs a = catch (PGS.executeMany c qs a) printError
       _ <- panic $ Text.pack $ show e
       throw (SomeException e)
 
+runPGSReturning :: (PGS.ToRow q, PGS.FromRow r)
+                => PGS.Connection -> PGS.Query -> [q] -> IO [r]
+runPGSReturning c qs a = catch (PGS.returning c qs a) printError
+  where
+    printError (SomeException e) = do
+      --q' <- PGS.formatQuery c qs a
+      _ <- panic $ Text.pack $ show e
+      throw (SomeException e)
+
 runPGSQuery :: (PGS.FromRow r, PGS.ToRow q)
             => PGS.Connection -> PGS.Query -> q -> IO [r]
 runPGSQuery c q a = catch (PGS.query c q a) printError
@@ -370,9 +395,6 @@ getNodesArchiveHistory c nodesId = do
               ORDER BY (version, node_story_archive_history.id) DESC
             |]
 
-ngramsIdQuery :: PGS.Query
-ngramsIdQuery = [sql| SELECT id FROM ngrams WHERE terms = ? |]
-
 insertNodeArchiveHistory :: PGS.Connection -> NodeId -> Version -> [NgramsStatePatch'] -> IO ()
 insertNodeArchiveHistory _ _ _ [] = pure ()
@@ -381,22 +403,23 @@ insertNodeArchiveHistory c nodeId version (h:hs) = do
                             (\(term, p) ->
                               (nodeId, nType, term, p)) <$> PM.toList patch) <$> PM.toList h :: [(NodeId, TableNgrams.NgramsType, NgramsTerm, NgramsPatch)]
   tuplesM <- mapM (\(nId, nType, term, patch) -> do
-                      ngrams <- runPGSQuery c ngramsIdQuery (PGS.Only term)
-                      pure $ (\(PGS.Only termId) -> (nId, nType, termId, term, patch)) <$> (headMay ngrams)
-                  ) tuples :: IO [Maybe (NodeId, TableNgrams.NgramsType, Int, NgramsTerm, NgramsPatch)]
-  _ <- runPGSExecuteMany c query $ ((\(nId, nType, termId, _term, patch) -> (nId, nType, termId, patch, version)) <$> catMaybes tuplesM)
+                      [PGS.Only ngramsId] <- runPGSReturning c qInsert [PGS.Only term] :: IO [PGS.Only Int]
+                      pure (nId, nType, ngramsId, term, patch)
+                  ) tuples :: IO [(NodeId, TableNgrams.NgramsType, Int, NgramsTerm, NgramsPatch)]
+  _ <- runPGSExecuteMany c query $ ((\(nId, nType, termId, _term, patch) -> (nId, nType, termId, patch, version)) <$> tuplesM)
   _ <- insertNodeArchiveHistory c nodeId version hs
   pure ()
   where
+    qInsert :: PGS.Query
+    qInsert = [sql|INSERT INTO ngrams (terms) VALUES (?)
+                  ON CONFLICT (terms) DO UPDATE SET terms = excluded.terms
+                  RETURNING id|]
+
     -- https://stackoverflow.com/questions/39224438/postgresql-insert-if-foreign-key-exists
     query :: PGS.Query
-    query = [sql| INSERT INTO node_story_archive_history(node_id, ngrams_type_id, ngrams_id, patch, version)
-                  VALUES (?, ?, ?, ?, ?)
-                |]
+    query = [sql| INSERT INTO node_story_archive_history(node_id, ngrams_type_id, ngrams_id, patch, version)
+                  SELECT node_id, ngrams_type_id, ngrams_id, patch::jsonb, version FROM (
+                    VALUES (?, ?, ?, ?, ?)
+                  ) AS i(node_id, ngrams_type_id, ngrams_id, patch, version)
+                  WHERE EXISTS (
+                    SELECT * FROM nodes where nodes.id = node_id
+                  )
+                |]
 
 getNodeStory :: PGS.Connection -> NodeId -> IO NodeListStory
 getNodeStory c nId = do
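runPGSReturning together with qInsert implements an insert-or-touch upsert that always returns the ngrams id. A self-contained sketch of the same pattern with postgresql-simple's returning; the connection string and database name are assumptions, not project configuration:

{-# LANGUAGE OverloadedStrings #-}

import Database.PostgreSQL.Simple (Only (..), connectPostgreSQL, returning)

main :: IO ()
main = do
  -- Hypothetical connection string; adjust to a real database.
  conn <- connectPostgreSQL "dbname=gargandb_test"
  -- Insert the term, or touch the existing row on conflict, and in
  -- both cases get its id back.
  ids  <- returning conn
            "INSERT INTO ngrams (terms) VALUES (?) \
            \ON CONFLICT (terms) DO UPDATE SET terms = excluded.terms \
            \RETURNING id"
            [Only ("hello world" :: String)]
  print (ids :: [Only Int])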
@@ -422,7 +445,7 @@ getNodeStory c nId = do
         pure ()
   -}
 
-  pure $ NodeStory $ Map.singleton nId $ foldl combine mempty dbData
+  pure $ NodeStory $ Map.singleton nId $ foldl combine initArchive dbData
   where
     -- NOTE (<>) for Archive doesn't concatenate states, so we have to use `combine`
     combine a1 a2 = a1 & a_state %~ combineState (a2 ^. a_state)
@@ -432,7 +455,8 @@ nodeStoriesQuery :: PGS.Query
 nodeStoriesQuery = [sql| SELECT version, ngrams_type_id, terms, ngrams_repo_element
                          FROM node_stories
                          JOIN ngrams ON ngrams.id = ngrams_id
-                         WHERE node_id = ? |]
+                         WHERE node_id = ?
+                       |]
 
 type ArchiveStateList = [(TableNgrams.NgramsType, NgramsTerm, NgramsRepoElement)]
 type ArchiveStateSet = Set.Set (TableNgrams.NgramsType, NgramsTerm)
@@ -455,42 +479,31 @@ archiveStateListFilterFromSet set =
 -- | This function inserts whole new node story and archive for given node_id.
 insertNodeStory :: PGS.Connection -> NodeId -> ArchiveList -> IO ()
 insertNodeStory c nId a = do
-  mapM_ (\(ngramsType, ngrams, ngramsRepoElement) -> do
-            termIdM <- runPGSQuery c ngramsIdQuery (PGS.Only ngrams) :: IO [PGS.Only Int64]
-            case headMay termIdM of
-              Nothing -> pure 0
-              Just (PGS.Only termId) -> runPGSExecuteMany c query [(PGS.toField nId, a ^. a_version, ngramsType, termId, ngramsRepoElement)]) $ archiveStateToList $ a ^. a_state
-  -- runInsert c $ insert ngramsType ngrams ngramsRepoElement) $ archiveStateToList _a_state
-  where
-    -- https://stackoverflow.com/questions/39224438/postgresql-insert-if-foreign-key-exists
-    query :: PGS.Query
-    query = [sql| INSERT INTO node_stories(node_id, ngrams_type_id, ngrams_id, ngrams_repo_element)
-                  SELECT * FROM (
-                    VALUES (?, ?, ?, ?)
-                  ) AS i(node_id, ngrams_type_id, ngrams_id, ngrams_repo_element)
-                  WHERE EXISTS (
-                    SELECT * FROM nodes where nodes.id = node_id
-                  )|]
-    -- insert ngramsType ngrams ngramsRepoElement =
-    --   Insert { iTable      = nodeStoryTable
-    --          , iRows       = [NodeStoryDB { node_id = sqlInt4 nId
-    --                                       , version = sqlInt4 _a_version
-    --                                       , ngrams_type_id = sqlInt4 $ TableNgrams.ngramsTypeId ngramsType
-    --                                       , ngrams_id = ...
-    --                                       , ngrams_repo_element = sqlValueJSONB ngramsRepoElement
-    --                                       }]
-    --          , iReturning  = rCount
-    --          , iOnConflict = Nothing }
+  insertArchiveStateList c nId (a ^. a_version) (archiveStateToList $ a ^. a_state)
 
 insertArchiveStateList :: PGS.Connection -> NodeId -> Version -> ArchiveStateList -> IO ()
 insertArchiveStateList c nodeId version as = do
-  mapM_ (\(nt, n, nre) -> runPGSExecute c query (nodeId, version, nt, nre, n)) as
+  mapM_ performInsert as
   where
+    performInsert (ngramsType, ngrams, ngramsRepoElement) = do
+      [PGS.Only ngramsId] <- tryInsertTerms ngrams
+      _ <- case ngramsRepoElement ^. nre_root of
+        Nothing -> pure []
+        Just r  -> tryInsertTerms r
+      mapM_ tryInsertTerms $ ngramsRepoElement ^. nre_children
+      runPGSExecute c query (nodeId, ngramsId, version, ngramsType, ngramsRepoElement)
+
+    tryInsertTerms :: NgramsTerm -> IO [PGS.Only Int]
+    tryInsertTerms t = runPGSReturning c qInsert [PGS.Only t]
+
+    qInsert :: PGS.Query
+    qInsert = [sql|INSERT INTO ngrams (terms) VALUES (?)
+                  ON CONFLICT (terms) DO UPDATE SET terms = excluded.terms
+                  RETURNING id|]
+
     query :: PGS.Query
-    query = [sql| WITH s as (SELECT ? as sid, ? sversion, ? sngrams_type_id, ngrams.id as sngrams_id, ?::jsonb as srepo FROM ngrams WHERE terms = ?)
-                  INSERT INTO node_stories(node_id, version, ngrams_type_id, ngrams_id, ngrams_repo_element)
-                  SELECT s.sid, s.sversion, s.sngrams_type_id, s.sngrams_id, s.srepo from s s join nodes n on s.sid = n.id
-                |]
+    query = [sql| INSERT INTO node_stories(node_id, ngrams_id, version, ngrams_type_id, ngrams_repo_element)
+                  VALUES (?, ?, ?, ?, ? :: jsonb)
+                |]
 
 deleteArchiveStateList :: PGS.Connection -> NodeId -> ArchiveStateList -> IO ()
@@ -499,19 +512,21 @@ deleteArchiveStateList c nodeId as = do
   where
     query :: PGS.Query
     query = [sql| DELETE FROM node_stories
-                  WHERE node_id = ? AND ngrams_type_id = ? AND ngrams_id IN (SELECT id FROM ngrams WHERE terms = ?)
+                  WHERE node_id = ? AND ngrams_type_id = ?
+                  AND ngrams_id IN (SELECT id FROM ngrams WHERE terms = ?)
                 |]
 
 updateArchiveStateList :: PGS.Connection -> NodeId -> Version -> ArchiveStateList -> IO ()
 updateArchiveStateList c nodeId version as = do
   let params = (\(nt, n, nre) -> (nre, version, nodeId, nt, n)) <$> as
+  --q <- PGS.format c query params
+  --printDebug "[updateArchiveList] query" q
   mapM_ (runPGSExecute c query) params
   where
     query :: PGS.Query
     query = [sql| UPDATE node_stories
                   SET ngrams_repo_element = ?, version = ?
-                  WHERE node_id = ? AND ngrams_type_id = ? AND ngrams_id IN (SELECT id FROM ngrams WHERE terms = ?)
+                  WHERE node_id = ? AND ngrams_type_id = ?
+                  AND ngrams_id IN (SELECT id FROM ngrams WHERE terms = ?)
                 |]
 
 -- | This function updates the node story and archive for given node_id.
 updateNodeStory :: PGS.Connection -> NodeId -> ArchiveList -> ArchiveList -> IO ()
@@ -542,7 +557,7 @@ updateNodeStory c nodeId currentArchive newArchive = do
   -- printDebug "[updateNodeStory] updates" $ Text.unlines $ (Text.pack . show) <$> updates
 
   -- 2. Perform inserts/deletes/updates
-  --printDebug "[updateNodeStory] applying insert" ()
+  --printDebug "[updateNodeStory] applying inserts" inserts
   insertArchiveStateList c nodeId (newArchive ^. a_version) inserts
   --printDebug "[updateNodeStory] insert applied" ()
   --TODO Use currentArchive ^. a_version in delete and report error
@@ -587,12 +602,12 @@ upsertNodeStories c nodeId newArchive = do
     pure ()
 
   -- 3. Now we need to set versions of all node state to be the same
-  fixNodeStoryVersion c nodeId newArchive
+  updateNodeStoryVersion c nodeId newArchive
 
   -- printDebug "[upsertNodeStories] STOP nId" nId
 
-fixNodeStoryVersion :: PGS.Connection -> NodeId -> ArchiveList -> IO ()
-fixNodeStoryVersion c nodeId newArchive = do
+updateNodeStoryVersion :: PGS.Connection -> NodeId -> ArchiveList -> IO ()
+updateNodeStoryVersion c nodeId newArchive = do
   let ngramsTypes = Map.keys $ newArchive ^. a_state
   mapM_ (\nt -> runPGSExecute c query (newArchive ^. a_version, nodeId, nt)) ngramsTypes
   where
@@ -607,21 +622,20 @@ writeNodeStories c (NodeStory nls) = do
   mapM_ (\(nId, a) -> upsertNodeStories c nId a) $ Map.toList nls
 
 -- | Returns a `NodeListStory`, updating the given one for given `NodeId`
-nodeStoryInc :: PGS.Connection -> Maybe NodeListStory -> NodeId -> IO NodeListStory
-nodeStoryInc c Nothing nId = getNodeStory c nId
-nodeStoryInc c (Just ns@(NodeStory nls)) nId = do
+nodeStoryInc :: PGS.Connection -> NodeListStory -> NodeId -> IO NodeListStory
+nodeStoryInc c ns@(NodeStory nls) nId = do
   case Map.lookup nId nls of
     Nothing -> do
-      (NodeStory nls') <- getNodeStory c nId
-      pure $ NodeStory $ Map.union nls nls'
+      NodeStory nls' <- getNodeStory c nId
+      pure $ NodeStory $ Map.unionWith archiveAdvance nls' nls
     Just _ -> pure ns
 
-nodeStoryIncs :: PGS.Connection -> Maybe NodeListStory -> [NodeId] -> IO NodeListStory
-nodeStoryIncs _ Nothing [] = pure $ NodeStory $ Map.empty
-nodeStoryIncs c Nothing (ni:ns) = do
+nodeStoryIncrementalRead :: PGS.Connection -> Maybe NodeListStory -> [NodeId] -> IO NodeListStory
+nodeStoryIncrementalRead _ Nothing [] = pure $ NodeStory $ Map.empty
+nodeStoryIncrementalRead c Nothing (ni:ns) = do
   m <- getNodeStory c ni
-  nodeStoryIncs c (Just m) ns
-nodeStoryIncs c (Just nls) ns = foldM (\m n -> nodeStoryInc c (Just m) n) nls ns
+  nodeStoryIncrementalRead c (Just m) ns
+nodeStoryIncrementalRead c (Just nls) ns = foldM (\m n -> nodeStoryInc c m n) nls ns
 
 -- nodeStoryDec :: Pool PGS.Connection -> NodeListStory -> NodeId -> IO NodeListStory
 -- nodeStoryDec pool ns@(NodeStory nls) ni = do
@@ -635,69 +649,104 @@ nodeStoryIncs c (Just nls) ns = foldM (\m n -> nodeStoryInc c (Just m) n) nls ns
 --   pure $ NodeStory ns'
 
 ------------------------------------
-readNodeStoryEnv :: Pool PGS.Connection -> IO NodeStoryEnv
-readNodeStoryEnv pool = do
-  mvar <- nodeStoryVar pool Nothing []
-  let saver_immediate = modifyMVar_ mvar $ \ns -> do
-        withResource pool $ \c -> do
-          --printDebug "[mkNodeStorySaver] will call writeNodeStories, ns" ns
-          writeNodeStories c ns
-          pure ns
+-- | NgramsRepoElement contains, in particular, `nre_list`,
+-- `nre_parent` and `nre_children`. We want to make sure that all
+-- children entries (i.e. ones that have `nre_parent`) have the same
+-- `list` as their parent entry.
+fixChildrenTermTypes :: NodeListStory -> NodeListStory
+fixChildrenTermTypes (NodeStory nls) =
+  NodeStory $ Map.fromList [ (nId, a & a_state %~ fixChildrenInNgramsStatePatch)
+                           | (nId, a) <- Map.toList nls ]
+
+fixChildrenInNgramsStatePatch :: NgramsState' -> NgramsState'
+fixChildrenInNgramsStatePatch ns = archiveStateFromList $ nsParents <> nsChildrenFixed
+  where
+    nls = archiveStateToList ns
+
+    nsParents = filter (\(_nt, _t, nre) -> isNothing $ nre ^. nre_parent) nls
+    parentNtMap = Map.fromList $ (\(_nt, t, nre) -> (t, nre ^. nre_list)) <$> nsParents
+
+    nsChildren = filter (\(_nt, _t, nre) -> isJust $ nre ^. nre_parent) nls
+    nsChildrenFixed = (\(nt, t, nre) ->
+                         ( nt, t
+                         , nre & nre_list %~
+                             (\l -> parentNtMap ^. at (nre ^. nre_parent . _Just) . non l)
+                         )
+                      ) <$> nsChildren
+
+-- | Sometimes, when we upload a new list, a child can be left without
+-- a parent. Find such ngrams and set their 'root' and 'parent' to
+-- 'Nothing'.
+fixChildrenWithNoParent :: NodeListStory -> NodeListStory
+fixChildrenWithNoParent (NodeStory nls) =
+  NodeStory $ Map.fromList [ (nId, a & a_state %~ fixChildrenWithNoParentStatePatch)
+                           | (nId, a) <- Map.toList nls ]
+
+fixChildrenWithNoParentStatePatch :: NgramsState' -> NgramsState'
+fixChildrenWithNoParentStatePatch ns = archiveStateFromList $ nsParents <> nsChildrenFixed
+  where
+    nls = archiveStateToList ns
+
+    nsParents = filter (\(_nt, _t, nre) -> isNothing $ nre ^. nre_parent) nls
+    parentNtMap = Map.fromList $ (\(_nt, t, nre) -> (t, nre ^. nre_children & mSetToSet)) <$> nsParents
+
+    nsChildren = filter (\(_nt, _t, nre) -> isJust $ nre ^. nre_parent) nls
+    nsChildrenFixFunc (nt, t, nre) =
+      ( nt, t
+      , nre { _nre_root = root
+            , _nre_parent = parent }
+      )
+      where
+        (root, parent) = case parentNtMap ^. at (nre ^. nre_parent . _Just) . _Just . at t of
+          Just _  -> (nre ^. nre_root, nre ^. nre_parent)
+          Nothing -> (Nothing, Nothing)
+
+    nsChildrenFixed = nsChildrenFixFunc <$> nsChildren
+
+------------------------------------
+fromDBNodeStoryEnv :: Pool PGS.Connection -> IO NodeStoryEnv
+fromDBNodeStoryEnv pool = do
+  tvar <- nodeStoryVar pool Nothing []
+  let saver_immediate = do
+        ns <- atomically $
+              readTVar tvar
+                -- fix children so their 'list' is the same as their parents'
+                >>= pure . fixChildrenTermTypes
+                -- fix children that don't have a parent anymore
+                >>= pure . fixChildrenWithNoParent
+                >>= writeTVar tvar
+                >> readTVar tvar
+        withResource pool $ \c -> do
+          --printDebug "[mkNodeStorySaver] will call writeNodeStories, ns" ns
+          writeNodeStories c ns
   let archive_saver_immediate ns@(NodeStory nls) = withResource pool $ \c -> do
         mapM_ (\(nId, a) -> do
                   insertNodeArchiveHistory c nId (a ^. a_version) $ reverse $ a ^. a_history
               ) $ Map.toList nls
         pure $ clearHistory ns
-  saver <- mkNodeStorySaver saver_immediate
-  -- let saver = modifyMVar_ mvar $ \mv -> do
-  --       writeNodeStories pool mv
-  --       printDebug "[readNodeStoryEnv] saver" mv
-  --       let mv' = clearHistory mv
-  --       printDebug "[readNodeStoryEnv] saver, cleared" mv'
-  --       pure mv'
-  pure $ NodeStoryEnv { _nse_var = mvar
-                      , _nse_saver = saver
+  pure $ NodeStoryEnv { _nse_var = tvar
                       , _nse_saver_immediate = saver_immediate
                       , _nse_archive_saver_immediate = archive_saver_immediate
-                      , _nse_getter = nodeStoryVar pool (Just mvar)
+                      , _nse_getter = nodeStoryVar pool (Just tvar)
                       }
 
 nodeStoryVar :: Pool PGS.Connection
-             -> Maybe (MVar NodeListStory)
+             -> Maybe (TVar NodeListStory)
              -> [NodeId]
-             -> IO (MVar NodeListStory)
+             -> IO (TVar NodeListStory)
 nodeStoryVar pool Nothing nIds = do
-  state' <- withResource pool $ \c -> nodeStoryIncs c Nothing nIds
-  newMVar state'
-nodeStoryVar pool (Just mv) nIds = do
-  _ <- withResource pool
-      $ \c -> modifyMVar_ mv
-      $ \nsl -> nodeStoryIncs c (Just nsl) nIds
-  pure mv
-
--- Debounce is useful since it could delay the saving to some later
--- time, asynchronously and we keep operating on memory only.
--- mkNodeStorySaver :: Pool PGS.Connection -> MVar NodeListStory -> IO (IO ())
--- mkNodeStorySaver pool mvns = do
-mkNodeStorySaver :: IO () -> IO (IO ())
-mkNodeStorySaver saver = mkDebounce settings
-  where
-    settings = defaultDebounceSettings
-                 { debounceAction = saver
-                   -- do
-                   --   -- NOTE: Lock MVar first, then use resource pool.
-                   --   -- Otherwise we could wait for MVar, while
-                   --   -- blocking the pool connection.
-                   --   modifyMVar_ mvns $ \ns -> do
-                   --     withResource pool $ \c -> do
-                   --       --printDebug "[mkNodeStorySaver] will call writeNodeStories, ns" ns
-                   --       writeNodeStories c ns
-                   --       pure $ clearHistory ns
-                   --withMVar mvns (\ns -> printDebug "[mkNodeStorySaver] debounce nodestory" ns)
-                 , debounceFreq = 1 * minute }
-    minute = 60 * sec
-    sec = 10^(6 :: Int)
+  state' <- withResource pool $ \c -> nodeStoryIncrementalRead c Nothing nIds
+  atomically $ newTVar state'
+nodeStoryVar pool (Just tv) nIds = do
+  nls <- atomically $ readTVar tv
+  nls' <- withResource pool
+      $ \c -> nodeStoryIncrementalRead c (Just nls) nIds
+  _ <- atomically $ writeTVar tv nls'
+  pure tv
 
 clearHistory :: NodeListStory -> NodeListStory
 clearHistory (NodeStory ns) = NodeStory $ ns & (traverse . a_history) .~ emptyHistory
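The new fixChildrenTermTypes pass enforces that a child ngram inherits its parent's list before the story is written to the database. A toy sketch of the same idea on a plain Map, with a hypothetical Entry type instead of the real lens-based code:

import Data.Map.Strict (Map)
import qualified Data.Map.Strict as Map
import Data.Maybe (fromMaybe)

-- Toy ngram entry: its list ("MapTerm", "StopTerm", ...) and optional parent.
data Entry = Entry { entryList :: String, entryParent :: Maybe String }
  deriving Show

-- Children take their parent's list when the parent is known.
fixChildren :: Map String Entry -> Map String Entry
fixChildren entries = Map.map fix entries
  where
    parentList term = entryList <$> Map.lookup term entries
    fix e = case entryParent e of
      Nothing -> e
      Just p  -> e { entryList = fromMaybe (entryList e) (parentList p) }

main :: IO ()
main = print $ fixChildren $ Map.fromList
  [ ("cats", Entry "MapTerm" Nothing)
  , ("cat",  Entry "StopTerm" (Just "cats")) ]  -- "cat" becomes MapTerm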
@@ -711,20 +760,9 @@ currentVersion listId = do
   pure $ nls ^. unNodeStory . at listId . _Just . a_version
 
--- mkNodeStorySaver :: MVar NodeListStory -> Cmd err (Cmd err ())
--- mkNodeStorySaver mvns = mkDebounce settings
---   where
---     settings = defaultDebounceSettings
---                  { debounceAction = withMVar mvns (\ns -> writeNodeStories ns)
---                  , debounceFreq = 1 * minute
----                  , debounceEdge = trailingEdge -- Trigger on the trailing edge
---                  }
---     minute = 60 * second
---     second = 10^(6 :: Int)
-
 -----------------------------------------
+-- | To be called from the REPL
 fixNodeStoryVersions :: (HasNodeStory env err m) => m ()
 fixNodeStoryVersions = do
   pool <- view connPool
src/Gargantext/Core/NodeStoryFile.hs → src/Gargantext/Core/NodeStoryFile.hs.old

@@ -22,7 +22,7 @@ import Control.Lens (view)
 import Data.ByteString.Lazy qualified as DBL
 import Data.List qualified as List
 import Data.Map.Strict qualified as Map
-import Gargantext.Core.NodeStory hiding (readNodeStoryEnv)
+import Gargantext.Core.NodeStory hiding (fromDBNodeStoryEnv)
 import Gargantext.Core.Types (ListId, NodeId(..))
 import Gargantext.Database.Prelude (hasConfig)
 import Gargantext.Database.Query.Table.Ngrams qualified as TableNgrams
src/Gargantext/Database/Action/Flow/List.hs
View file @ 7985388f
...
@@ -17,7 +17,6 @@ Portability : POSIX
  module Gargantext.Database.Action.Flow.List
    where
- import Control.Concurrent
  import Control.Lens ((^.), (+~), (%~), at, (.~), _Just)
  import Control.Monad.Reader
  import Data.List qualified as List
...
@@ -35,6 +34,7 @@ import Gargantext.Database.Query.Table.Ngrams qualified as TableNgrams
  import Gargantext.Database.Query.Table.NodeNgrams (NodeNgramsPoly(..), NodeNgramsW, listInsertDb, {- getCgramsId -})
  import Gargantext.Database.Schema.Ngrams (NgramsType(..))
  import Gargantext.Prelude hiding (toList)
+ import GHC.Conc (readTVar, writeTVar)

  -- FLOW LIST
  -- 1. select specific terms of the corpus when compared with others langs
...
@@ -202,8 +202,10 @@ putListNgrams nodeId ngramsType nes = putListNgrams' nodeId ngramsType m
     -- If valid the rest would be atomic and no merge is required.
     -}
     var <- getNodeStoryVar [listId]
-    liftBase $ modifyMVar_ var $ \r -> do
-      pure $ r & unNodeStory . at listId . _Just . a_version +~ 1
-               & unNodeStory . at listId . _Just . a_history %~ (p :)
-               & unNodeStory . at listId . _Just . a_state . at ngramsType' .~ Just ns
+    liftBase $ atomically $ do
+      r <- readTVar var
+      writeTVar var $ r & unNodeStory . at listId . _Just . a_version +~ 1
+                        & unNodeStory . at listId . _Just . a_history %~ (p :)
+                        & unNodeStory . at listId . _Just . a_state . at ngramsType' .~ Just ns
     saveNodeStory
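The hunk above replaces the MVar-based update with a single STM transaction on a TVar. A minimal, self-contained sketch of the same pattern, using only standard Control.Concurrent.MVar and GHC.Conc primitives (the Gargantext lenses are replaced here by a plain counter):

import Control.Concurrent.MVar (newMVar, modifyMVar_, readMVar)
import GHC.Conc (atomically, newTVarIO, readTVar, writeTVar, readTVarIO)

main :: IO ()
main = do
  -- Before: mutate shared state held in an MVar.
  mv <- newMVar (0 :: Int)
  modifyMVar_ mv $ \r -> pure (r + 1)
  readMVar mv >>= print

  -- After: the same update as one STM transaction on a TVar,
  -- mirroring the read/modify/write shape now used in putListNgrams.
  tv <- newTVarIO (0 :: Int)
  atomically $ do
    r <- readTVar tv
    writeTVar tv (r + 1)
  readTVarIO tv >>= print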
test/Test/API/Setup.hs
View file @ 7985388f
  {-# LANGUAGE BangPatterns #-}
+ {-# LANGUAGE OverloadedStrings #-}
  module Test.API.Setup where
...
@@ -21,6 +22,7 @@ import Gargantext.Database.Admin.Trigger.Init
  import Gargantext.Database.Admin.Types.Hyperdata
  import Gargantext.Database.Prelude
  import Gargantext.Database.Query.Table.Node (getOrMkList)
+ -- import Gargantext.Prelude (printDebug)
  import Gargantext.Prelude.Config
  import Gargantext.System.Logging
  import Network.HTTP.Client.TLS (newTlsManager)
...
@@ -54,7 +56,7 @@ newTestEnv testEnv logger port = do
  dbParam <- pure $ testEnvToPgConnectionInfo testEnv
  !pool <- newPool dbParam
- !nodeStory_env <- readNodeStoryEnv pool
+ !nodeStory_env <- fromDBNodeStoryEnv pool
  !scrapers_env <- ServantAsync.newJobEnv ServantAsync.defaultSettings manager_env
  secret <- Jobs.genSecret
...
@@ -101,6 +103,7 @@ setupEnvironment env = flip runReaderT env $ runTestMonad $ do
                    (Left corpusMasterName)
                    (Nothing :: Maybe HyperdataCorpus)
  masterListId <- getOrMkList masterCorpusId masterUserId
+ -- printDebug "[setupEnvironment] masterListId: " masterListId
  void $ initLastTriggers masterListId

  -- | Creates two users, Alice & Bob. Alice shouldn't be able to see
...
test/Test/Database/Operations.hs
View file @ 7985388f
  {-# LANGUAGE OverloadedStrings #-}
+ {-# LANGUAGE QuasiQuotes #-}
  {-# LANGUAGE TupleSections #-}
  {-# LANGUAGE TypeApplications #-}
  {-# LANGUAGE UndecidableInstances #-}
...
@@ -7,12 +8,14 @@
  module Test.Database.Operations (
    tests
+   , nodeStoryTests
    ) where

  import Control.Monad.Except
  import Control.Monad.Reader
  import Data.Text qualified as T
  import Database.PostgreSQL.Simple
+ import Database.PostgreSQL.Simple.SqlQQ
  import Gargantext.API.Node.Corpus.Update
  import Gargantext.Core
  import Gargantext.Core.Types.Individu
...
@@ -20,12 +23,14 @@ import Gargantext.Database.Action.User
  import Gargantext.Database.Action.User.New
  import Gargantext.Database.Admin.Types.Hyperdata.Corpus
  import Gargantext.Database.Admin.Types.Node
+ import Gargantext.Database.Prelude (runPGSQuery)
  import Gargantext.Database.Query.Table.Node
  import Gargantext.Database.Query.Tree.Root (getRootId)
  import Gargantext.Database.Schema.Node (NodePoly(..))
  import Gargantext.Prelude
  import Test.API.Setup (setupEnvironment)
  import Test.Database.Operations.DocumentSearch
+ import Test.Database.Operations.NodeStory
  import Test.Database.Setup (withTestDB)
  import Test.Database.Types
  import Test.Hspec
...
@@ -64,6 +69,25 @@ tests = sequential $ aroundAll withTestDB $ describe "Database" $ do
      it "Can perform more complex searches using the boolean API" corpusSearch03
      it "Can correctly count doc score" corpusScore01

+ nodeStoryTests :: Spec
+ nodeStoryTests = sequential $
+   -- run 'withTestDB' before _every_ test item
+   around setupDBAndCorpus $
+     describe "Database - node story" $ do
+       describe "Node story" $ do
+         it "[#281] Can create a list" createListTest
+         it "[#281] Can query node story" queryNodeStoryTest
+         it "[#218] Can add new terms to node story" insertNewTermsToNodeStoryTest
+         it "[#281] Can add new terms (with children) to node story" insertNewTermsWithChildrenToNodeStoryTest
+         it "[#281] Fixes child terms to match parents' terms" insertNodeStoryChildrenWithDifferentNgramsTypeThanParentTest
+         it "[#281] Can update node story when 'setListNgrams' is called" setListNgramsUpdatesNodeStoryTest
+         it "[#281] When 'setListNgrams' is called, childrens' parents are updated" setListNgramsUpdatesNodeStoryWithChildrenTest
+         it "[#281] Correctly commits patches to node story - simple" commitPatchSimpleTest
+   where
+     setupDBAndCorpus testsFunc = withTestDB $ \env -> do
+       setupEnvironment env
+       testsFunc env

  data ExpectedActual a =
      Expected a
    | Actual a
...
@@ -126,8 +150,10 @@ corpusReadWrite01 env = do
  flip runReaderT env $ runTestMonad $ do
    uid <- getUserId (UserName "alfredo")
    parentId <- getRootId (UserName "alfredo")
-   [corpusId] <- mk (Just "Test_Corpus") (Nothing :: Maybe HyperdataCorpus) parentId uid
-   liftIO $ corpusId `shouldBe` UnsafeMkNodeId 416
+   let corpusName = "Test_Corpus"
+   [corpusId] <- mk (Just corpusName) (Nothing :: Maybe HyperdataCorpus) parentId uid
+   [Only corpusId'] <- runPGSQuery [sql|SELECT id FROM nodes WHERE name = ?|] (Only corpusName)
+   liftIO $ corpusId `shouldBe` UnsafeMkNodeId corpusId'
    -- Retrieve the corpus by Id
    [corpus] <- getCorporaWithParentId parentId
    liftIO $ corpusId `shouldBe` (_node_id corpus)
...
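The new nodeStoryTests tree uses hspec's around so that the database setup runs before every single item rather than once per suite (aroundAll). A minimal sketch of that pattern with a stand-in resource; the withResource name and the Int resource are illustrative only, not part of the test suite:

import Test.Hspec

-- Stand-in for withTestDB: acquire a resource, hand it to the test, clean up.
withResource :: (Int -> IO ()) -> IO ()
withResource action = action 42

spec :: Spec
spec = sequential $ around withResource $
  describe "around" $
    it "hands a freshly acquired resource to each item" $ \n ->
      n `shouldBe` (42 :: Int)

main :: IO ()
main = hspec spec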
test/Test/Database/Operations/NodeStory.hs
0 → 100644
View file @ 7985388f

{-|
Module      : Test.Database.Operations.NodeStory
Description : GarganText database tests
Copyright   : (c) CNRS, 2017-Present
License     : AGPL + CECILL v3
Maintainer  : team@gargantext.org
Stability   : experimental
Portability : POSIX
-}

{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE QuasiQuotes #-}

module Test.Database.Operations.NodeStory where

import Control.Lens ((^.), (.~), _2)
import Control.Monad.Reader
import Data.Map.Strict qualified as Map
import Data.Map.Strict.Patch qualified as PM
import Data.Set qualified as Set
import Database.PostgreSQL.Simple qualified as PSQL
import Database.PostgreSQL.Simple.SqlQQ
import Gargantext.API.Ngrams (commitStatePatch, mSetFromList, setListNgrams, saveNodeStoryImmediate)
import Gargantext.API.Ngrams.Types (MSet(..), NgramsPatch(..), NgramsRepoElement(..), NgramsTerm(..), Versioned(..), mkNgramsTablePatch, nre_children, nre_list, nre_parent, nre_root)
import Gargantext.API.Ngrams.Tools (getNodeStoryVar)
import Gargantext.Core.NodeStory hiding (runPGSQuery)
import Gargantext.Core.Types.Individu
import Gargantext.Core.Types (ListType(..), ListId, NodeId, UserId)
import Gargantext.Database.Action.User (getUserId)
import Gargantext.Database.Admin.Config (userMaster)
import Gargantext.Database.Prelude (runPGSQuery)
import Gargantext.Database.Query.Table.Ngrams (selectNgramsId)
import Gargantext.Database.Query.Table.Node
import Gargantext.Database.Query.Tree.Root
import Gargantext.Database.Schema.Ngrams (NgramsType(..))
import Gargantext.Database.Schema.Node (NodePoly(..))
import Gargantext.Prelude
import GHC.Conc (TVar, readTVar)
import Test.Database.Types
import Test.Hspec.Expectations
import Test.Tasty.HUnit
commonInitialization :: TestMonad (UserId, NodeId, ListId, TVar NodeListStory)
commonInitialization = do
  let user = UserName userMaster
  parentId <- getRootId user
  [corpus] <- getCorporaWithParentId parentId
  let corpusId = _node_id corpus

  userId <- getUserId user
  listId <- getOrMkList corpusId userId

  v <- getNodeStoryVar [listId]

  pure $ (userId, corpusId, listId, v)
initArchiveList :: ArchiveList
initArchiveList = initArchive

simpleTerm :: (NgramsTerm, NgramsRepoElement)
simpleTerm = ( NgramsTerm "hello"
             , NgramsRepoElement { _nre_size = 1
                                 , _nre_list = MapTerm
                                 , _nre_root = Nothing
                                 , _nre_parent = Nothing
                                 , _nre_children = MSet Map.empty } )

simpleParentTerm' :: NgramsTerm
simpleParentTerm' = fst simpleTerm
simpleParentTerm :: (NgramsTerm, NgramsRepoElement)
simpleParentTerm = ( simpleParentTerm'
                   , simpleTerm ^. _2
                       & nre_children .~ (mSetFromList [simpleChildTerm']) )

simpleChildTerm' :: NgramsTerm
simpleChildTerm' = NgramsTerm "world"
simpleChildTerm :: (NgramsTerm, NgramsRepoElement)
simpleChildTerm = ( simpleChildTerm'
                  , simpleTerm ^. _2
                      & nre_parent .~ Just simpleParentTerm'
                      & nre_root .~ Just simpleParentTerm' )
-- tests start here

createListTest :: TestEnv -> Assertion
createListTest env = do
  flip runReaderT env $ runTestMonad $ do
    (userId, corpusId, listId, _v) <- commonInitialization

    listId' <- getOrMkList corpusId userId

    liftIO $ listId `shouldBe` listId'
queryNodeStoryTest :: TestEnv -> Assertion
queryNodeStoryTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    saveNodeStoryImmediate

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId initArchiveList)
insertNewTermsToNodeStoryTest :: TestEnv -> Assertion
insertNewTermsToNodeStoryTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    let (terms, nre) = simpleTerm
    let nls = Map.singleton terms nre
    setListNgrams listId NgramsTerms nls

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nls }))
    -- check that the ngrams are in the DB as well
    ngramsMap <- selectNgramsId [unNgramsTerm terms]
    liftIO $ (snd <$> Map.toList ngramsMap) `shouldBe` [unNgramsTerm terms]

    -- Finally, check that node stories are inserted correctly
    dbTerms <- runPGSQuery [sql|
                               SELECT terms
                               FROM ngrams
                               JOIN node_stories ON ngrams.id = ngrams_id
                               WHERE node_id = ?
                             |] (PSQL.Only listId)
    liftIO $ dbTerms `shouldBe` [PSQL.Only $ unNgramsTerm terms]
insertNewTermsWithChildrenToNodeStoryTest :: TestEnv -> Assertion
insertNewTermsWithChildrenToNodeStoryTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    let (tParent, nreParent) = simpleParentTerm
    let (tChild, nreChild) = simpleChildTerm
    let terms = unNgramsTerm <$> [tParent, tChild]

    let nls = Map.fromList [(tParent, nreParent), (tChild, nreChild)]
    setListNgrams listId NgramsTerms nls

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nls }))

    -- `setListNgrams` calls saveNodeStory already so we should have
    -- the terms in the DB by now
    ngramsMap <- selectNgramsId terms
    liftIO $ (snd <$> Map.toList ngramsMap) `shouldBe` terms

    dbTerms <- runPGSQuery [sql|
                               SELECT terms
                               FROM ngrams
                               JOIN node_stories ON ngrams.id = ngrams_id
                               WHERE node_id = ?
                             |] (PSQL.Only listId)
    liftIO $ (Set.fromList $ (\(PSQL.Only t) -> t) <$> dbTerms) `shouldBe` (Set.fromList terms)

    -- let (Just (tParentId, _)) = head $ filter ((==) (unNgramsTerm tParent) . snd) $ Map.toList ngramsMap2
    -- let (Just (tChildId, _)) = head $ filter ((==) (unNgramsTerm tChild) . snd) $ Map.toList ngramsMap2

    -- [PSQL.Only tParentId'] <-
    --   runPGSQuery [sql|SELECT parent_id FROM ngrams WHERE terms = ?|] (PSQL.Only tChild)

    -- liftIO $ tParentId `shouldBe` tParentId'
insertNodeStoryChildrenWithDifferentNgramsTypeThanParentTest :: TestEnv -> Assertion
insertNodeStoryChildrenWithDifferentNgramsTypeThanParentTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    let (tParent, nreParent) = simpleParentTerm
    let (tChild, nreChildGoodType) = simpleChildTerm
    let nreChildBrokenType = nreChildGoodType & nre_list .~ MapTerm
    let terms = unNgramsTerm <$> [tParent, tChild]

    let nls = Map.fromList [(tParent, nreParent), (tChild, nreChildBrokenType)]
    let nlsWithChildFixed = Map.fromList [(tParent, nreParent), (tChild, nreChildGoodType)]

    setListNgrams listId NgramsTerms nls

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nlsWithChildFixed }))

    ngramsMap <- selectNgramsId terms
    liftIO $ (snd <$> Map.toList ngramsMap) `shouldBe` terms

    dbTerms <- runPGSQuery [sql|
                               SELECT terms
                               FROM ngrams
                               JOIN node_stories ON ngrams.id = ngrams_id
                               WHERE node_id = ?
                             |] (PSQL.Only listId)
    liftIO $ (Set.fromList $ (\(PSQL.Only t) -> t) <$> dbTerms) `shouldBe` (Set.fromList terms)

    let (Just (tChildId, _)) = head $ filter ((==) (unNgramsTerm tChild) . snd) $ Map.toList ngramsMap

    [PSQL.Only childType] <- runPGSQuery [sql|SELECT ngrams_repo_element->>'list'
                                              FROM node_stories
                                              WHERE ngrams_id = ?|] (PSQL.Only tChildId)
    liftIO $ childType `shouldBe` ("MapTerm" :: Text)
setListNgramsUpdatesNodeStoryTest :: TestEnv -> Assertion
setListNgramsUpdatesNodeStoryTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    let (terms, nre) = simpleTerm
    let nls = Map.singleton terms nre
    setListNgrams listId NgramsTerms nls

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nls }))
    -- check that the ngrams are in the DB as well
    ngramsMap <- selectNgramsId [unNgramsTerm terms]
    liftIO $ (snd <$> Map.toList ngramsMap) `shouldBe` [unNgramsTerm terms]

    let nre2 = NgramsRepoElement { _nre_size = 1
                                 , _nre_list = MapTerm
                                 , _nre_root = Nothing
                                 , _nre_parent = Nothing
                                 , _nre_children = MSet Map.empty }
    let terms2 = "WORLD"
    let nls2 = Map.singleton (NgramsTerm terms2) nre2
    setListNgrams listId NgramsTerms nls2

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms $ nls <> nls2 }))
setListNgramsUpdatesNodeStoryWithChildrenTest :: TestEnv -> Assertion
setListNgramsUpdatesNodeStoryWithChildrenTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    let (tChild, nreChild) = simpleChildTerm
    let (tParent, nreParent) = simpleParentTerm
    let nls = Map.fromList [(tParent, nreParent), (tChild, nreChild)]
    setListNgrams listId NgramsTerms nls

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nls }))

    -- OK, now we substitute parent with no children, the parent of
    -- 'nreChild' should become Nothing
    let nreParentNew = nreParent { _nre_children = MSet $ Map.empty }
    let nlsToInsert = Map.fromList [(tParent, nreParentNew)]
    setListNgrams listId NgramsTerms nlsToInsert

    let nreChildNew = nreChild { _nre_parent = Nothing
                               , _nre_root = Nothing }
    let nlsNew = Map.fromList [(tParent, nreParentNew), (tChild, nreChildNew)]

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nlsNew }))
commitPatchSimpleTest :: TestEnv -> Assertion
commitPatchSimpleTest env = do
  flip runReaderT env $ runTestMonad $ do
    (_userId, _corpusId, listId, v) <- commonInitialization

    -- initially, the node story table is empty
    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.empty }))

    let (term, nre) = simpleTerm
    let tPatch = NgramsReplace { _patch_old = Nothing
                               , _patch_new = Just nre }
    ver <- currentVersion listId
    let ntp = mkNgramsTablePatch $ Map.singleton term tPatch
    let (pm, _validation) = PM.singleton NgramsTerms ntp
    let patch = Versioned ver pm

    _patchApplied <- commitStatePatch listId patch

    let nls = Map.fromList [(term, nre)]

    liftIO $ do
      ns <- atomically $ readTVar v
      ns `shouldBe` (NodeStory $ Map.singleton listId
                                   (initArchiveList { _a_state = Map.singleton NgramsTerms nls
                                                    , _a_version = ver + 1 }))
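commitPatchSimpleTest expects the archive version to move from ver to ver + 1 once a single patch is committed. A tiny, self-contained sketch of that optimistic-versioning idea; the Versioned' and commit names below are illustrative stand-ins, not the Gargantext types:

import qualified Data.Map.Strict as Map

-- Illustrative stand-ins, not the Gargantext API.
data Versioned' p = Versioned' { vVersion :: Int, vData :: p }

type State = Map.Map String Int

-- Apply a patch built against some client version and bump the server version.
commit :: Versioned' State -> (Int, State) -> (Int, State)
commit (Versioned' _clientVer patch) (serverVer, st) =
  (serverVer + 1, Map.union patch st)

main :: IO ()
main = do
  let server0 = (0, Map.empty)
      patch   = Versioned' 0 (Map.singleton "hello" 1)
  print (commit patch server0)  -- version bumped to 1, state now contains "hello"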
test/Test/Database/Setup.hs
View file @ 7985388f
...
@@ -15,6 +15,7 @@ import Database.PostgreSQL.Simple.Options qualified as Client
  import Database.PostgreSQL.Simple.Options qualified as Opts
  import Database.Postgres.Temp qualified as Tmp
  import Gargantext.API.Admin.EnvTypes (Mode(Mock))
+ import Gargantext.Core.NodeStory (fromDBNodeStoryEnv)
  import Gargantext.Prelude
  import Gargantext.Prelude.Config
  import Gargantext.System.Logging (withLoggerHoisted)
...
@@ -71,8 +72,13 @@ setup = do
                 (PG.close) 2 60 2
  bootstrapDB db pool gargConfig
  ugen <- emptyCounter
+ test_nodeStory <- fromDBNodeStoryEnv pool
  withLoggerHoisted Mock $ \logger -> do
-   pure $ TestEnv (DBHandle pool db) gargConfig ugen logger
+   pure $ TestEnv { test_db = DBHandle pool db
+                  , test_config = gargConfig
+                  , test_nodeStory
+                  , test_usernameGen = ugen
+                  , test_logger = logger }

  withTestDB :: (TestEnv -> IO ()) -> IO ()
  withTestDB = bracket setup teardown
...
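The new TestEnv construction above uses record syntax, and test_nodeStory appears without an "=" because it is a field pun: a local binding with the same name as the field. A minimal sketch of that language feature (the Env record here is made up for illustration):

{-# LANGUAGE NamedFieldPuns #-}

-- Illustrative record, mirroring the TestEnv construction style above.
data Env = Env { envPool :: String, envCounter :: Int }

mkEnv :: IO Env
mkEnv = do
  let envPool = "pool"              -- local binding named after the field
  pure $ Env { envPool, envCounter = 0 }  -- pun: envPool = envPool

main :: IO ()
main = mkEnv >>= \e -> putStrLn (envPool e)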
test/Test/Database/Types.hs
View file @ 7985388f
...
@@ -32,6 +32,7 @@ import Gargantext.API.Admin.Orchestrator.Types
  import Gargantext.API.Prelude
  import Gargantext.Core.Mail.Types (HasMail(..))
  import Gargantext.Core.NLP (HasNLPServer(..))
+ import Gargantext.Core.NodeStory
  import Gargantext.Database.Prelude (HasConfig(..), HasConnectionPool(..))
  import Gargantext.Database.Query.Table.Node.Error
  import Gargantext.Prelude.Config
...
@@ -57,6 +58,7 @@ nextCounter (Counter ref) = atomicModifyIORef' ref (\old -> (succ old, old))
  data TestEnv = TestEnv {
      test_db :: !DBHandle
    , test_config :: !GargConfig
+   , test_nodeStory :: !NodeStoryEnv
    , test_usernameGen :: !Counter
    , test_logger :: !(Logger (GargM TestEnv GargError))
    }
...
@@ -107,6 +109,20 @@ instance HasMail TestEnv where
                           , _mc_mail_password = "test"
                           , _mc_mail_login_type = NoAuth })

+ instance HasNodeStoryEnv TestEnv where
+   hasNodeStory = to test_nodeStory
+
+ instance HasNodeStoryVar TestEnv where
+   hasNodeStoryVar = hasNodeStory . nse_getter
+
+ instance HasNodeStoryImmediateSaver TestEnv where
+   hasNodeStoryImmediateSaver = hasNodeStory . nse_saver_immediate
+
+ instance HasNodeArchiveStoryImmediateSaver TestEnv where
+   hasNodeArchiveStoryImmediateSaver = hasNodeStory . nse_archive_saver_immediate
+
  coreNLPConfig :: NLPServerConfig
  coreNLPConfig =
    let uri = parseURI "http://localhost:9000"
...
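The added instances expose the test_nodeStory field through lens getters ("to test_nodeStory", then composition with nse_getter and friends). A small sketch of turning a plain record field into a read-only optic with "to" from the standard lens API; the Env and name identifiers are made up for illustration:

import Control.Lens (Getter, to, view, (^.))

data Env = Env { envName :: String }

-- Expose the field as a read-only optic, in the spirit of 'hasNodeStory = to test_nodeStory'.
name :: Getter Env String
name = to envName

main :: IO ()
main = do
  let e = Env "test"
  putStrLn (view name e)  -- "test"
  putStrLn (e ^. name)    -- same thing, operator form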
test/Test/Ngrams/Query.hs
View file @ 7985388f
...
@@ -4,17 +4,16 @@ module Test.Ngrams.Query (tests) where
  import Control.Monad
  import Data.Coerce
+ import Data.Map.Strict qualified as Map
  import Data.Monoid
+ import Data.Patch.Class qualified as Patch
+ import Data.Text qualified as T
+ import Data.Validity qualified as Validity
  import Gargantext.API.Ngrams
  import Gargantext.API.Ngrams.Types
  import Gargantext.Core.Types.Main
  import Gargantext.Core.Types.Query
  import Gargantext.Prelude
- import qualified Data.Map.Strict as Map
- import qualified Data.Patch.Class as Patch
- import qualified Data.Validity as Validity
- import qualified Data.Text as T
  import Test.Ngrams.Query.PaginationCorpus
  import Test.Tasty
  import Test.Tasty.HUnit
...
test/drivers/hspec/Main.hs
View file @ 7985388f
...
@@ -43,3 +43,4 @@ main = do
  bracket startCoreNLPServer stopCoreNLPServer $ \_ -> hspec $ do
    API.tests
    DB.tests
+   DB.nodeStoryTests