Commit 53512f89 authored by Alfredo Di Napoli

Port DB operations to transactional API

This gigantic commit ports the existing DB operations in GGTX to the
transactional API, meaning that DB operations can now be composed and
will all run in the same Postgres transaction over the same connection.
This eliminates the class of bugs where concurrent DB access could
leave the database in an inconsistent state.
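
As an illustration, here is a minimal sketch of the kind of composition
this enables, modelled on the forgotPasswordGetUser change below; the
helper name is made up and its signature is simplified for the example:

    -- Sketch only: both updates run on the same connection inside a
    -- single BEGIN/COMMIT, so if the second update fails the first one
    -- is rolled back as well.
    resetForgottenPassword :: IsDBCmd env err m => UserLight -> m ()
    resetForgottenPassword user =
      runDBTx $ do
        void $ updateUserPassword user
        void $ updateUserForgotPasswordUUID
                 (user { userLight_forgot_password_uuid = Nothing })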

On top of that, we simplify some parts of the API, for which a summary
is given below:

1. The `NodeStoryEnv` management has been greatly simplified: the new
   API no longer needs an external connection pool to be passed in,
   and instead of passing IO actions we can pass plain DB operations,
   so the environment now consists mostly of pure values;

2. Because our `DBTx` monad can't do arbitrary IO (which is a good
   thing), we cannot fire Central Exchange notifications immediately.
   What happens now is that we collect the `CEMessage`s to be sent and
   fire them in the relevant concrete monad once the DB transaction
   has finished (see the sketch after this list). In principle this
   introduces a small delay between the DB operation taking place and
   the notification firing, but in practice the latency should be
   negligible, and bear in mind this is typically what we want: if a
   long DB transaction triggers an error in the middle, we don't want
   to have sent out CE messages prematurely for an operation that
   didn't succeed!

3. There are still a few places in the codebase where we couldn't make
   things fully compositional with regard to the DBTx API, because
   some Servant handlers mix DB operations with other IO-effectful
   computations (or with things like the notifications from
   `MonadJobStatus`). For now we split these functions and run the
   partial DB operations manually; while this is not ideal, it can be
   fixed in subsequent merge requests.

4. The `WorkerEnv` doesn't use `IOException` as its `MonadError`
   anymore; for consistency we use `BackendInternalError` instead,
   adding an `InternalWorkerError` data constructor that wraps the
   `IOException` thrown by the Worker monad (see the second sketch
   below).
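
As an illustration of points 2 and 3, a DB operation that needs to
notify Central Exchange now has roughly the shape sketched below.
`ce_notify` and `someCEMessage` are hypothetical names used only to
convey the idea; the concrete API may differ:

    -- Sketch only: inside the transaction the notification is merely
    -- queued, not sent. The queued CEMessages are fired in the concrete
    -- monad once the transaction has finished, so nothing goes out if
    -- the transaction fails halfway through.
    updateAndNotify :: IsDBCmd env err m => NodeId -> HyperdataCorpus -> m ()
    updateAndNotify nId hyperdata =
      runDBTx $ do
        void $ updateHyperdata nId hyperdata  -- DB work, one transaction
        ce_notify (someCEMessage nId)         -- hypothetical: only collects the message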

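A sketch of the new error constructor from point 4; the surrounding
constructors are abbreviated from ones appearing elsewhere in this
diff, and their exact fields may differ:

    data BackendInternalError
      = InternalAuthenticationError AuthenticationError
      | InternalServerError ServerError
      -- ... other existing constructors elided ...
      | InternalWorkerError IOException  -- new: wraps the IOException thrown
                                         -- by the Worker monad, so WorkerEnv
                                         -- can use BackendInternalError as
                                         -- its MonadError
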
More testing is needed, with particular attention to performance
regressions, but this should hopefully offer a decent baseline.
parent c0f94390
......@@ -295,6 +295,7 @@ library
Gargantext.Database.Admin.Types.Hyperdata.Document
Gargantext.Database.Admin.Types.Hyperdata.Folder
Gargantext.Database.Admin.Types.Node
Gargantext.Database.Class
Gargantext.Database.Prelude
Gargantext.Database.Query.Facet
Gargantext.Database.Query.Table.Ngrams
......@@ -544,7 +545,6 @@ library
, fgl ^>= 5.8.0.0
, filepath ^>= 1.4.2.2
, fmt
, formatting ^>= 7.2.0
, free >= 0.5.0
, fullstop ^>= 0.1.4
, gargantext-graph-core >= 0.2.0.0
......
......@@ -24,7 +24,6 @@ And you have the main viz
-}
{-# LANGUAGE MonoLocalBinds #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TypeApplications #-}
......@@ -62,7 +61,7 @@ import Gargantext.Core.Types.Individu (User(..), Username, GargPassword(..))
import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
import Gargantext.Database.Action.User.New (guessUserName)
import Gargantext.Database.Admin.Types.Node (NodeId(..), UserId)
import Gargantext.Database.Prelude (Cmd, IsDBEnvExtra, IsDBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.User
import Gargantext.Database.Query.Tree (isDescendantOf, isIn)
import Gargantext.Database.Query.Tree.Root (getRoot)
......@@ -100,14 +99,14 @@ checkAuthRequest couldBeEmail (GargPassword p) = do
Nothing -> couldBeEmail -- we are sure this is not an email
Just (u,_) -> u -- this was an email in fact
candidate <- head <$> getUsersWith usrname
candidate <- head <$> runDBQuery (getUsersWith usrname)
case candidate of
Nothing -> pure InvalidUser
Just (UserLight { userLight_password = GargPassword h, .. }) ->
case Auth.checkPassword (Auth.mkPassword p) (Auth.PasswordHash h) of
Auth.PasswordCheckFail -> pure InvalidPassword
Auth.PasswordCheckSuccess -> do
muId <- head <$> getRoot (UserName usrname)
muId <- head <$> runDBQuery (getRoot (UserName usrname))
case _node_id <$> muId of
Nothing -> pure InvalidUser
Just nodeId -> do
......@@ -144,12 +143,13 @@ withAccessM :: ( IsDBCmd env err m )
-> m a
-> m a
withAccessM (AuthenticatedUser nodeId _userId) (PathNode id) m = do
d <- id `isDescendantOf` nodeId
d <- runDBQuery (id `isDescendantOf` nodeId)
if d then m else m -- serverError err401
withAccessM (AuthenticatedUser nodeId _userId) (PathNodeNode cId docId) m = do
_a <- isIn cId docId -- TODO use one query for all ?
_d <- cId `isDescendantOf` nodeId
runDBQuery $ do
void $ isIn cId docId -- TODO use one query for all ?
void $ (cId `isDescendantOf` nodeId)
if True -- a && d
then m
else m -- serverError err401
......@@ -249,7 +249,7 @@ forgotPasswordGet (Just uuid) = do
Nothing -> throwError $ _ServerError # err404 { errBody = "Not found" }
Just uuid' -> do
-- fetch user
us <- getUsersWithForgotPasswordUUID uuid'
us <- runDBQuery $ getUsersWithForgotPasswordUUID uuid'
case us of
[u] -> forgotPasswordGetUser u
_ -> throwError $ _ServerError # err404 { errBody = "Not found" }
......@@ -266,12 +266,10 @@ forgotPasswordGetUser (UserLight { .. }) = do
hashed <- liftBase $ Auth.hashPassword $ Auth.mkPassword password
let hashed' = Auth.unPasswordHash hashed
let userPassword = UserLight { userLight_password = GargPassword hashed', .. }
_ <- updateUserPassword userPassword
-- display this briefly in the html
-- clear the uuid so that the page can't be refreshed
_ <- updateUserForgotPasswordUUID $ UserLight { userLight_forgot_password_uuid = Nothing, .. }
runDBTx $ do
void $ updateUserPassword userPassword
void $ updateUserForgotPasswordUUID $ UserLight { userLight_forgot_password_uuid = Nothing, .. }
pure $ ForgotPasswordGet password
......@@ -286,7 +284,7 @@ forgotUserPassword (UserLight { .. }) = do
let userUUID = UserLight { userLight_forgot_password_uuid = Just $ toText uuid, .. }
-- save user with that uuid
_ <- updateUserForgotPasswordUUID userUUID
_ <- runDBTx $ updateUserForgotPasswordUUID userUUID
-- send email with uuid link
cfg <- view $ mailSettings
......@@ -304,7 +302,7 @@ generateForgotPasswordUUID :: (IsDBEnvExtra env)
=> Cmd env err UUID
generateForgotPasswordUUID = do
uuid <- liftBase $ nextRandom
us <- getUsersWithForgotPasswordUUID uuid
us <- runDBQuery $ getUsersWithForgotPasswordUUID uuid
case us of
[] -> pure uuid
_ -> generateForgotPasswordUUID
......
......@@ -80,7 +80,7 @@ modeToLoggingLevels = \case
data Env = Env
{ _env_logger :: ~(Logger (GargM Env BackendInternalError))
, _env_pool :: ~(Pool Connection)
, _env_nodeStory :: ~NodeStoryEnv
, _env_nodeStory :: ~(NodeStoryEnv BackendInternalError)
, _env_manager :: ~Manager
, _env_config :: ~GargConfig
, _env_dispatcher :: ~Dispatcher
......@@ -96,15 +96,9 @@ instance HasConfig Env where
instance HasConnectionPool Env where
connPool = env_pool
instance HasNodeStoryEnv Env where
instance HasNodeStoryEnv Env BackendInternalError where
hasNodeStory = env_nodeStory
instance HasNodeStoryImmediateSaver Env where
hasNodeStoryImmediateSaver = hasNodeStory . nse_saver_immediate
instance HasNodeArchiveStoryImmediateSaver Env where
hasNodeArchiveStoryImmediateSaver = hasNodeStory . nse_archive_saver_immediate
instance HasJWTSettings Env where
jwtSettings = env_jwt_settings
......@@ -152,7 +146,7 @@ data DevEnv = DevEnv
, _dev_env_manager :: ~Manager
, _dev_env_logger :: !(Logger (GargM DevEnv BackendInternalError))
, _dev_env_pool :: !(Pool Connection)
, _dev_env_nodeStory :: !NodeStoryEnv
, _dev_env_nodeStory :: !(NodeStoryEnv BackendInternalError)
}
makeLenses ''DevEnv
......@@ -198,15 +192,9 @@ instance HasConnectionPool DevEnv where
connPool = dev_env_pool
instance HasNodeStoryEnv DevEnv where
instance HasNodeStoryEnv DevEnv BackendInternalError where
hasNodeStory = dev_env_nodeStory
instance HasNodeStoryImmediateSaver DevEnv where
hasNodeStoryImmediateSaver = hasNodeStory . nse_saver_immediate
instance HasNodeArchiveStoryImmediateSaver DevEnv where
hasNodeArchiveStoryImmediateSaver = hasNodeStory . nse_archive_saver_immediate
instance HasMail DevEnv where
mailSettings = dev_env_config . gc_mail_config
......
......@@ -27,10 +27,10 @@ import Database.PostgreSQL.Simple (Connection, connect, close, ConnectInfo)
import Gargantext.API.Admin.EnvTypes (Env(..))
import Gargantext.API.Errors.Types (BackendInternalError)
import Gargantext.API.Prelude (GargM)
import Gargantext.Core.Notifications.Dispatcher qualified as D
import Gargantext.Core.Config (GargConfig(..))
import Gargantext.Core.Config.Types (jwtSettings)
import Gargantext.Core.NodeStory (fromDBNodeStoryEnv)
import Gargantext.Core.NodeStory (mkNodeStoryEnv)
import Gargantext.Core.Notifications.Dispatcher qualified as D
import Gargantext.Prelude
import Gargantext.System.Logging (Logger)
import Network.HTTP.Client.TLS (newTlsManager)
......@@ -150,7 +150,7 @@ newEnv logger config dispatcher = do
-- putStrLn ("Overrides: " <> show prios :: Text)
-- putStrLn ("New priorities: " <> show prios' :: Text)
!pool <- newPool $ _gc_database_config config
!nodeStory_env <- fromDBNodeStoryEnv pool
let !nodeStory_env = mkNodeStoryEnv
-- secret <- Jobs.genSecret
-- let jobs_settings = (Jobs.defaultJobSettings 1 secret)
......
......@@ -47,25 +47,25 @@ import Gargantext.API.Errors (BackendInternalError)
import Gargantext.API.Errors.Types (AccessPolicyErrorReason(..))
import Gargantext.Core.Config (GargConfig(..), HasConfig(hasConfig))
import Gargantext.Core.Config.Types (SecretsConfig(..))
import Gargantext.Core.Types.Individu (User(UserName))
import Gargantext.Core.Types (NodeId, UserId)
import Gargantext.Database.Prelude (DBCmd)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Core.Types.Individu (User(UserName))
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNode)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.NodeNode
import Gargantext.Database.Query.Tree (isDescendantOf, isOwnedBy, isSharedWith, lookupPublishPolicy)
import Gargantext.Database.Query.Tree.Root (getRootId)
import Gargantext.Database.Schema.Node (node_user_id)
import Gargantext.Prelude
import Servant (HasServer(..), ServerT)
import Servant.API.Routes (HasRoutes(getRoutes))
import Servant.Auth.Server.Internal.AddSetCookie (AddSetCookieApi, AddSetCookies(..), Nat(S))
import Servant.Client.Core (HasClient(..), Client)
import Servant.Ekg (HasEndpoint(..))
import Servant (HasServer(..), ServerT)
import Servant.OpenApi qualified as OpenAPI
import Servant.Server.Internal.Delayed (addParameterCheck)
import Servant.Server.Internal.DelayedIO (DelayedIO(..))
import Servant.Swagger qualified as Swagger
import Servant.OpenApi qualified as OpenAPI
-------------------------------------------------------------------------------
-- Types
......@@ -156,41 +156,43 @@ accessPolicyManager = AccessPolicyManager (\ur ac -> interpretPolicy ur ac)
check' :: HasNodeError err => AuthenticatedUser -> AccessCheck -> DBCmd err AccessResult
check' (AuthenticatedUser loggedUserNodeId loggedUserUserId) = \case
AC_always_deny
-> pure $ Deny invalidUserPermissions
AC_always_allow
-> pure Allow
AC_user_node requestedNodeId
-> do ownedByMe <- requestedNodeId `isOwnedBy` loggedUserUserId
enforce invalidUserPermissions $ (loggedUserNodeId == requestedNodeId || ownedByMe)
AC_user requestedUserId
-> enforce invalidUserPermissions $ (loggedUserUserId == requestedUserId)
AC_master_user _requestedNodeId
-> do
masterUsername <- _s_master_user . _gc_secrets <$> view hasConfig
masterNodeId <- getRootId (UserName masterUsername)
enforce invalidUserPermissions $ masterNodeId == loggedUserNodeId
AC_node_descendant nodeId
-> enforce nodeNotDescendant =<< nodeId `isDescendantOf` loggedUserNodeId
AC_node_shared nodeId
-> enforce nodeNotShared =<< nodeId `isSharedWith` loggedUserNodeId
AC_node_published_read nodeId
-> enforce nodeNotShared =<< isNodeReadOnly nodeId
AC_node_published_edit nodeId
-> do
mb_pp <- lookupPublishPolicy nodeId
targetNode <- getNode nodeId
let allowedOrNot = do
case mb_pp of
Nothing -> pure Allow
Just NPP_publish_no_edits_allowed
-> throwError not_editable
Just NPP_publish_edits_only_owner_or_super
-> enforce (nodeNotShared' not_editable) (targetNode ^. node_user_id == loggedUserUserId)
case allowedOrNot of
Left err -> enforce (nodeNotShared' err) False
Right _ -> pure Allow
check' (AuthenticatedUser loggedUserNodeId loggedUserUserId) c = do
cfg <- view hasConfig
runDBQuery $ case c of
AC_always_deny
-> pure $ Deny invalidUserPermissions
AC_always_allow
-> pure Allow
AC_user_node requestedNodeId
-> do ownedByMe <- requestedNodeId `isOwnedBy` loggedUserUserId
enforce invalidUserPermissions $ (loggedUserNodeId == requestedNodeId || ownedByMe)
AC_user requestedUserId
-> enforce invalidUserPermissions $ (loggedUserUserId == requestedUserId)
AC_master_user _requestedNodeId
-> do
let masterUsername = _s_master_user . _gc_secrets $ cfg
masterNodeId <- getRootId (UserName masterUsername)
enforce invalidUserPermissions $ masterNodeId == loggedUserNodeId
AC_node_descendant nodeId
-> enforce nodeNotDescendant =<< nodeId `isDescendantOf` loggedUserNodeId
AC_node_shared nodeId
-> enforce nodeNotShared =<< nodeId `isSharedWith` loggedUserNodeId
AC_node_published_read nodeId
-> enforce nodeNotShared =<< isNodeReadOnly nodeId
AC_node_published_edit nodeId
-> do
mb_pp <- lookupPublishPolicy nodeId
targetNode <- getNode nodeId
let allowedOrNot = do
case mb_pp of
Nothing -> pure Allow
Just NPP_publish_no_edits_allowed
-> throwError not_editable
Just NPP_publish_edits_only_owner_or_super
-> enforce (nodeNotShared' not_editable) (targetNode ^. node_user_id == loggedUserUserId)
case allowedOrNot of
Left err -> enforce (nodeNotShared' err) False
Right _ -> pure Allow
-------------------------------------------------------------------------------
-- Errors
......
......@@ -21,9 +21,9 @@ import Gargantext.API.Admin.Auth.Types (PathId(..), AuthenticatedUser)
import Gargantext.API.Prelude (IsGargServer)
import Gargantext.API.Routes.Named.Context qualified as Named
import Gargantext.Database.Admin.Types.Node (ContextId, contextId2NodeId)
import Gargantext.Database.Prelude (JSONB)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Context (getContextWith)
import Gargantext.Prelude
import Gargantext.Prelude
import Servant.Server.Generic (AsServerT)
-------------------------------------------------------------------
......@@ -40,4 +40,4 @@ contextAPI :: ( IsGargServer env err m
contextAPI p uId id' =
withNamedAccess uId (PathNode $ contextId2NodeId id') contextAPI'
where
contextAPI' = Named.ContextAPI $ getContextWith id' p
contextAPI' = Named.ContextAPI $ runDBQuery (getContextWith id' p)
......@@ -23,7 +23,7 @@ import Gargantext.API.Prelude ( GargM )
import Gargantext.Core.Config (_gc_database_config, gc_logging)
import Gargantext.Core.Config.Types (SettingsFile(..))
import Gargantext.Core.Config.Utils (readConfig)
import Gargantext.Core.NodeStory (fromDBNodeStoryEnv)
import Gargantext.Core.NodeStory (mkNodeStoryEnv)
import Gargantext.Database.Prelude (Cmd, CmdRandom, connPool, runCmd)
import Gargantext.Prelude
import Gargantext.System.Logging ( withLoggerIO )
......@@ -41,7 +41,7 @@ withDevEnv settingsFile k = do
where
newDevEnv logger cfg = do
pool <- newPool (_gc_database_config cfg)
nodeStory_env <- fromDBNodeStoryEnv pool
let nodeStory_env = mkNodeStoryEnv
manager <- newTlsManager
pure $ DevEnv
{ _dev_env_pool = pool
......
......@@ -23,7 +23,7 @@ import Gargantext.Database.Admin.Types.Hyperdata.Contact
, cw_lastName
, hc_who, ContactWhere, hc_where, cw_organization, cw_labTeamDepts, cw_role, cw_office, cw_country, cw_city, cw_touch, ct_mail, ct_phone, ct_url, hc_title, hc_source)
import Gargantext.Database.Admin.Types.Node (ContextId (..))
import Gargantext.Database.Prelude (IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Context (getContextWith)
import Gargantext.Database.Schema.Node (node_hyperdata)
import Gargantext.Prelude
......@@ -71,7 +71,7 @@ dbAnnuaireContacts contact_id = do
-- FIXME(adinapoli) This function seems a bit iffy, unless a 'contact_id'
-- is just a synonym for a 'ContextId'.
c <- lift $ getContextWith (UnsafeMkContextId contact_id) (Proxy :: Proxy HyperdataContact)
c <- lift $ runDBQuery $ getContextWith (UnsafeMkContextId contact_id) (Proxy :: Proxy HyperdataContact)
pure [toAnnuaireContact (contact_id, c ^. node_hyperdata)]
toAnnuaireContact :: (Int, HyperdataContact) -> AnnuaireContact
......
......@@ -33,7 +33,7 @@ import Gargantext.API.Prelude (GargM)
import Gargantext.Core.Types.Search (HyperdataRow(..), toHyperdataRow)
import Gargantext.Database.Admin.Types.Hyperdata.Document ( HyperdataDocument )
import Gargantext.Database.Admin.Types.Node (ContextTitle, NodeId(..), NodeTypeId, UserId, unNodeId, ContextId (..))
import Gargantext.Database.Prelude (IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.NodeContext (getNodeContext, getContextsForNgramsTerms, ContextForNgramsTerms(..), {- getContextNgrams, -} getContextNgramsMatchingFTS)
import Gargantext.Database.Query.Table.NodeContext qualified as DNC
import Gargantext.Database.Schema.NodeContext (NodeContext, NodeContextPoly(..))
......@@ -147,7 +147,7 @@ dbNodeContext context_id node_id = do
-- user <- getUsersWithId user_id
-- hyperdata <- getUserHyperdata user_id
-- lift (map toUser <$> zip user hyperdata)
c <- lift $ getNodeContext (UnsafeMkContextId context_id) (UnsafeMkNodeId node_id)
c <- lift $ runDBQuery $ getNodeContext (UnsafeMkContextId context_id) (UnsafeMkNodeId node_id)
pure $ toNodeContextGQL <$> [c]
-- | Returns list of `ContextGQL` for given ngrams in given corpus id.
......@@ -155,7 +155,7 @@ dbContextForNgrams
:: (IsDBEnvExtra env)
=> Int -> [Text] -> Text -> GqlM e env [ContextGQL]
dbContextForNgrams node_id ngrams_terms and_logic = do
contextsForNgramsTerms <- lift $ getContextsForNgramsTerms (UnsafeMkNodeId node_id) ngrams_terms ( readMaybe $ unpack $ Text.toTitle and_logic )
contextsForNgramsTerms <- lift $ runDBQuery $ getContextsForNgramsTerms (UnsafeMkNodeId node_id) ngrams_terms ( readMaybe $ unpack $ Text.toTitle and_logic )
--lift $ printDebug "[dbContextForNgrams] contextsForNgramsTerms" contextsForNgramsTerms
pure $ toContextGQL <$> contextsForNgramsTerms
......@@ -164,7 +164,7 @@ dbContextNgrams
:: (IsDBEnvExtra env)
=> Int -> Int -> GqlM e env [Text]
dbContextNgrams context_id list_id = do
lift $ getContextNgramsMatchingFTS (UnsafeMkContextId context_id) (UnsafeMkNodeId list_id)
lift $ runDBQuery $ getContextNgramsMatchingFTS (UnsafeMkContextId context_id) (UnsafeMkNodeId list_id)
-- Conversion functions
......@@ -228,5 +228,5 @@ updateNodeContextCategory :: (IsDBEnvExtra env)
-> GqlM' e env [Int]
updateNodeContextCategory autUser mgr NodeContextCategoryMArgs { context_id, node_id, category } =
withPolicy autUser mgr (nodeWriteChecks $ UnsafeMkNodeId node_id) $ do
void $ lift $ DNC.updateNodeContextCategory (UnsafeMkContextId context_id) (UnsafeMkNodeId node_id) category
void $ lift $ runDBTx $ DNC.updateNodeContextCategory (UnsafeMkContextId context_id) (UnsafeMkNodeId node_id) category
pure [1]
......@@ -24,7 +24,7 @@ import Gargantext.API.GraphQL.Types ( GqlM )
import Gargantext.Core ( HasDBid(lookupDBid) )
import Gargantext.Database.Admin.Types.Node (NodeType)
import Gargantext.Database.Admin.Types.Node qualified as NN
import Gargantext.Database.Prelude (IsDBEnvExtra) -- , JSONB)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getClosestChildrenByType, getClosestParentIdByType, getNode)
import Gargantext.Database.Schema.Node qualified as N
import Gargantext.Prelude
......@@ -74,14 +74,14 @@ dbNodes
:: (IsDBEnvExtra env)
=> Int -> GqlM e env [Node]
dbNodes node_id = do
node <- lift $ getNode $ NN.UnsafeMkNodeId node_id
node <- lift $ runDBQuery $ getNode $ NN.UnsafeMkNodeId node_id
pure [toNode node]
dbNodesCorpus
:: (IsDBEnvExtra env)
=> Int -> GqlM e env [Corpus]
dbNodesCorpus corpus_id = do
corpus <- lift $ getNode $ NN.UnsafeMkNodeId corpus_id
corpus <- lift $ runDBQuery $ getNode $ NN.UnsafeMkNodeId corpus_id
pure [toCorpus corpus]
data NodeParentArgs
......@@ -116,19 +116,21 @@ dbParentNodes node_id parentType = do
-- lift $ printDebug "[dbParentNodes] error reading parent type" (T.pack err)
-- pure []
-- Right parentType -> do
mNodeId <- lift $ getClosestParentIdByType (NN.UnsafeMkNodeId node_id) parentType -- (fromNodeTypeId parent_type_id)
lift $ runDBQuery $ do
mNodeId <- getClosestParentIdByType (NN.UnsafeMkNodeId node_id) parentType -- (fromNodeTypeId parent_type_id)
case mNodeId of
Nothing -> pure []
Just id -> do
node <- lift $ getNode id
node <- getNode id
pure [toNode node]
dbChildNodes :: (IsDBEnvExtra env)
=> Int -> NodeType -> GqlM e env [Node]
dbChildNodes node_id childType = do
childIds <- lift $ getClosestChildrenByType (NN.UnsafeMkNodeId node_id) childType -- (fromNodeTypeId parent_type_id)
children <- lift $ mapM getNode childIds
pure $ toNode <$> children
lift $ runDBQuery $ do
childIds <- getClosestChildrenByType (NN.UnsafeMkNodeId node_id) childType -- (fromNodeTypeId parent_type_id)
children <- mapM getNode childIds
pure $ toNode <$> children
toNode :: NN.Node json -> Node
toNode N.Node { .. } = Node { id = nid
......
......@@ -25,7 +25,7 @@ import Gargantext.Core.Config (HasJWTSettings)
import Gargantext.Core.Types (NodeId(..), unNodeId)
import Gargantext.Core.Types.Individu qualified as Individu
import Gargantext.Database.Action.Share (membersOf, deleteMemberShip)
import Gargantext.Database.Prelude (IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNode)
import Gargantext.Database.Query.Table.User (getUsersWithNodeHyperdata)
import Gargantext.Database.Schema.Node (NodePoly(Node, _node_id), _node_user_id)
......@@ -60,13 +60,14 @@ dbTeam :: (IsDBEnvExtra env) =>
Int -> GqlM e env Team
dbTeam nodeId = do
let nId = UnsafeMkNodeId nodeId
res <- lift $ membersOf nId
teamNode <- lift $ getNode nId
userNodes <- lift $ getUsersWithNodeHyperdata $ Individu.UserDBId $ uId teamNode
let username = getUsername userNodes
pure $ Team { team_owner_username = username
, team_members = map toTeamMember res
}
lift $ runDBQuery $ do
res <- membersOf nId
teamNode <- getNode nId
userNodes <- getUsersWithNodeHyperdata $ Individu.UserDBId $ uId teamNode
let username = getUsername userNodes
pure $ Team { team_owner_username = username
, team_members = map toTeamMember res
}
where
toTeamMember :: (Text, NodeId) -> TeamMember
toTeamMember (username, fId)= TeamMember {
......@@ -81,18 +82,19 @@ dbTeam nodeId = do
deleteTeamMembership :: (IsDBEnvExtra env, HasJWTSettings env) =>
TeamDeleteMArgs -> GqlM' e env [Int]
deleteTeamMembership TeamDeleteMArgs { token, shared_folder_id, team_node_id } = do
teamNode <- lift $ getNode $ UnsafeMkNodeId team_node_id
userNodes <- lift (getUsersWithNodeHyperdata $ Individu.UserDBId $ uId teamNode)
userNodes <- lift $ runDBTx $ do
teamNode <- getNode $ UnsafeMkNodeId team_node_id
getUsersWithNodeHyperdata $ Individu.UserDBId $ uId teamNode
case userNodes of
[] -> panicTrace $ "[deleteTeamMembership] User with id " <> T.pack (show $ uId teamNode) <> " doesn't exist."
[] -> panicTrace $ "[deleteTeamMembership] User with id " <> T.pack (show $ team_node_id) <> " doesn't exist."
(( _, node_u):_) -> do
testAuthUser <- lift $ authUser (nId node_u) token
lift $ case testAuthUser of
case testAuthUser of
-- Invalid -> panicTrace "[deleteTeamMembership] failed to validate user"
Invalid -> do
throwError $ InternalAuthenticationError $ UserNotAuthorized (uId node_u) "This user is not team owner"
lift $ throwError $ InternalAuthenticationError $ UserNotAuthorized (uId node_u) "This user is not team owner"
Valid -> do
deleteMemberShip [(UnsafeMkNodeId shared_folder_id, UnsafeMkNodeId team_node_id)]
lift $ runDBTx $ deleteMemberShip [(UnsafeMkNodeId shared_folder_id, UnsafeMkNodeId team_node_id)]
where
uId Node { _node_user_id } = _node_user_id
nId Node { _node_id } = _node_id
......@@ -25,7 +25,7 @@ import Gargantext.Core.Types.Main ( Tree(..), _tn_node, _tn_children, NodeTree(.
import Gargantext.Database.Admin.Config ()
import Gargantext.Database.Admin.Types.Node (allNodeTypes, NodeId(..), NodeType)
import Gargantext.Database.Admin.Types.Node qualified as NN
import Gargantext.Database.Prelude (IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNode)
import Gargantext.Database.Query.Tree qualified as T
import Gargantext.Database.Schema.Node (NodePoly(_node_parent_id))
......@@ -77,10 +77,11 @@ dbTree :: (IsDBEnvExtra env) =>
NN.UserId -> Int -> GqlM e env (TreeFirstLevel (GqlM e env))
dbTree loggedInUserId root_id = do
let rId = UnsafeMkNodeId root_id
t <- lift $ T.tree loggedInUserId T.TreeFirstLevel rId allNodeTypes
n <- lift $ getNode $ UnsafeMkNodeId root_id
let pId = toParentId n
pure $ toTree rId pId t
lift $ runDBQuery $ do
t <- T.tree loggedInUserId T.TreeFirstLevel rId allNodeTypes
n <- getNode $ UnsafeMkNodeId root_id
let pId = toParentId n
pure $ toTree rId pId t
where
toParentId N.Node { _node_parent_id } = _node_parent_id
......@@ -100,7 +101,7 @@ childrenToTreeNodes (TreeN {_tn_node}, rId) = toTreeNode (Just rId) _tn_node
resolveParent :: (IsDBEnvExtra env) => Maybe NodeId -> GqlM e env (Maybe TreeNode)
resolveParent (Just pId) = do
node <- lift $ getNode pId
node <- lift $ runDBQuery $ getNode pId
pure $ nodeToTreeNode node
resolveParent Nothing = pure Nothing
......@@ -133,6 +134,6 @@ convertDbTreeToTreeNode T.DbTreeNode { _dt_name, _dt_nodeId, _dt_typeId, _dt_par
dbRecursiveParents :: (IsDBEnvExtra env) => Int -> GqlM e env BreadcrumbInfo
dbRecursiveParents nodeId = do
let nId = UnsafeMkNodeId nodeId
dbParents <- lift $ T.recursiveParents nId allNodeTypes
dbParents <- lift $ runDBQuery $ T.recursiveParents nId allNodeTypes
let treeNodes = map convertDbTreeToTreeNode dbParents
pure $ BreadcrumbInfo { parents = treeNodes }
......@@ -22,7 +22,7 @@ import Gargantext.API.GraphQL.Types (GqlM, GqlM')
import Gargantext.Core.Types (NodeId(..), UserId)
import Gargantext.Core.Types.Individu qualified as Individu
import Gargantext.Database.Admin.Types.Hyperdata (HyperdataUser(..))
import Gargantext.Database.Prelude (IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.User qualified as DBUser
import Gargantext.Database.Schema.User (UserLight(..))
import Gargantext.Prelude
......@@ -72,7 +72,7 @@ resolveUsers autUser mgr UserArgs { user_id } = do
-- | Inner function to fetch the user from DB.
dbUsers :: (IsDBEnvExtra env)
=> Int -> GqlM e env [User (GqlM e env)]
dbUsers user_id = lift (map toUser <$> DBUser.getUsersWithId (Individu.RootId $ UnsafeMkNodeId user_id))
dbUsers user_id = lift (map toUser <$> runDBQuery (DBUser.getUsersWithId (Individu.RootId $ UnsafeMkNodeId user_id)))
toUser
:: (IsDBEnvExtra env)
......@@ -85,25 +85,25 @@ toUser (UserLight { .. }) = User { u_email = userLight_email
resolveHyperdata
:: (IsDBEnvExtra env)
=> UserId -> GqlM e env (Maybe HyperdataUser)
resolveHyperdata userid = lift (listToMaybe <$> DBUser.getUserHyperdata (Individu.UserDBId userid))
resolveHyperdata userid = lift (listToMaybe <$> runDBQuery (DBUser.getUserHyperdata (Individu.UserDBId userid)))
updateUserPubmedAPIKey :: ( IsDBEnvExtra env ) =>
UserPubmedAPIKeyMArgs -> GqlM' e env Int
updateUserPubmedAPIKey UserPubmedAPIKeyMArgs { user_id, api_key } = do
_ <- lift $ DBUser.updateUserPubmedAPIKey (Individu.RootId $ UnsafeMkNodeId user_id) api_key
_ <- lift $ runDBTx $ DBUser.updateUserPubmedAPIKey (Individu.RootId $ UnsafeMkNodeId user_id) api_key
pure 1
updateUserEPOAPIUser :: ( IsDBEnvExtra env ) =>
UserEPOAPIUserMArgs -> GqlM' e env Int
updateUserEPOAPIUser UserEPOAPIUserMArgs { user_id, api_user } = do
_ <- lift $ DBUser.updateUserEPOAPIUser (Individu.RootId $ UnsafeMkNodeId user_id) api_user
_ <- lift $ runDBTx $ DBUser.updateUserEPOAPIUser (Individu.RootId $ UnsafeMkNodeId user_id) api_user
pure 1
updateUserEPOAPIToken :: ( IsDBEnvExtra env ) =>
UserEPOAPITokenMArgs -> GqlM' e env Int
updateUserEPOAPIToken UserEPOAPITokenMArgs { user_id, api_token } = do
_ <- lift $ DBUser.updateUserEPOAPIToken (Individu.RootId $ UnsafeMkNodeId user_id) api_token
_ <- lift $ runDBTx $ DBUser.updateUserEPOAPIToken (Individu.RootId $ UnsafeMkNodeId user_id) api_token
pure 1
......@@ -49,7 +49,7 @@ import Gargantext.API.GraphQL.Utils (AuthStatus(Invalid, Valid), authUser)
import Gargantext.Core.Config (HasJWTSettings)
import Gargantext.Core.Types (UserId(..))
import Gargantext.Core.Types.Individu qualified as Individu
import Gargantext.Database.Prelude (IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
import Gargantext.Database.Query.Table.User (getUsersWithHyperdata, getUsersWithNodeHyperdata, updateUserEmail)
import Gargantext.Database.Schema.Node (node_id, node_hyperdata, NodePoly (Node, _node_id))
......@@ -124,7 +124,7 @@ updateUserInfo
=> UserInfoMArgs -> GqlM' e env Int
updateUserInfo (UserInfoMArgs { ui_id, .. }) = do
-- lift $ printDebug "[updateUserInfo] ui_id" ui_id
users <- lift (getUsersWithNodeHyperdata (Individu.UserDBId $ UnsafeMkUserId ui_id))
users <- lift $ runDBQuery $ getUsersWithNodeHyperdata (Individu.UserDBId $ UnsafeMkUserId ui_id)
case users of
[] -> panicTrace $ "[updateUserInfo] User with id " <> (T.pack $ show ui_id) <> " doesn't exist."
((UserLight { .. }, node_u):_) -> do
......@@ -155,10 +155,11 @@ updateUserInfo (UserInfoMArgs { ui_id, .. }) = do
let u' = UserLight { userLight_email = fromMaybe userLight_email $ view ui_cwTouchMailL u_hyperdata'
, .. }
-- lift $ printDebug "[updateUserInfo] with firstName" u_hyperdata'
_ <- lift $ updateHyperdata (node_u ^. node_id) u_hyperdata'
_ <- lift $ updateUserEmail u'
--let _newUser = toUser (u, u_hyperdata')
pure 1
lift $ runDBTx $ do
_ <- updateHyperdata (node_u ^. node_id) u_hyperdata'
_ <- updateUserEmail u'
--let _newUser = toUser (u, u_hyperdata')
pure 1
where
uh _ Nothing u_hyperdata = u_hyperdata
uh lens' (Just val) u_hyperdata = u_hyperdata & lens' ?~ val
......@@ -175,7 +176,7 @@ dbUsers user_id = do
-- user <- getUsersWithId user_id
-- hyperdata <- getUserHyperdata user_id
-- lift (map toUser <$> zip user hyperdata)
lift (map toUser <$> getUsersWithHyperdata (Individu.UserDBId user_id))
lift (map toUser <$> runDBQuery (getUsersWithHyperdata (Individu.UserDBId user_id)))
toUser :: (UserLight, HyperdataUser) -> UserInfo
toUser (UserLight { .. }, u_hyperdata) =
......
......@@ -15,6 +15,7 @@ import Gargantext.API.Routes.Named.Private qualified as Named
import Gargantext.Database.Action.Share (membersOf)
import Gargantext.Database.Admin.Types.Node (NodeType(NodeTeam))
import Gargantext.Database.Query.Table.Node (getNodesIdWithType)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Servant.Server.Generic (AsServerT)
......@@ -22,7 +23,7 @@ members :: IsGargServer err env m => Named.MembersAPI (AsServerT m)
members = Named.MembersAPI getMembers
getMembers :: IsGargServer err env m => m [Text]
getMembers = do
getMembers = runDBQuery $ do
teamNodeIds <- getNodesIdWithType NodeTeam
m <- concatMapM membersOf teamNodeIds
pure $ map fst m
......@@ -17,6 +17,7 @@ Portability : POSIX
module Gargantext.API.Ngrams.List
where
import Control.Lens (view)
import Data.Aeson qualified as Aeson
import Data.ByteString.Lazy qualified as BSL
import Data.Csv qualified as Tsv
......@@ -27,8 +28,8 @@ import Data.Map.Strict qualified as Map
import Data.Set qualified as Set
import Data.Text (concat, pack, splitOn)
import Data.Text.Encoding qualified as TE
import Data.Vector qualified as Vec
import Data.Vector (Vector)
import Data.Vector qualified as Vec
import Database.PostgreSQL.Simple.LargeObjects qualified as PSQL
import Gargantext.API.Admin.EnvTypes (Env)
import Gargantext.API.Errors.Types (BackendInternalError(InternalServerError))
......@@ -39,13 +40,13 @@ import Gargantext.API.Ngrams.Types
import Gargantext.API.Prelude (GargM, serverError, HasServerError)
import Gargantext.API.Routes.Named.List qualified as Named
import Gargantext.API.Worker (serveWorkerAPIM)
import Gargantext.Core.NodeStory.Types ( HasNodeStory )
import Gargantext.Core.NodeStory.Types ( HasNodeStory, hasNodeStory, NodeStoryEnv )
import Gargantext.Core.Text.Ngrams (Ngrams, NgramsType(NgramsTerms))
import Gargantext.Core.Types.Main (ListType(..))
import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
import Gargantext.Database.Action.Flow (reIndexWith)
import Gargantext.Database.Admin.Types.Node ( NodeId(_NodeId), ListId )
import Gargantext.Database.Prelude (createLargeObject)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNode)
import Gargantext.Database.Schema.Ngrams ( text2ngrams, NgramsId )
import Gargantext.Database.Schema.Node (_node_parent_id)
......@@ -78,7 +79,8 @@ getJson :: HasNodeStory env err m
=> ListId
-> m (Headers '[Header "Content-Disposition" Text] NgramsList)
getJson lId = do
lst <- getNgramsList lId
env <- view hasNodeStory
lst <- runDBQuery $ getNgramsList env lId
pure $ addHeader (concat [ "attachment; filename=GarganText_NgramsList-"
, pack $ show (_NodeId lId)
, ".json"
......@@ -89,7 +91,8 @@ getJsonZip :: HasNodeStory env err m
=> ListId
-> m (Headers '[Header "Content-Disposition" Text] NgramsListZIP)
getJsonZip lId = do
lst <- getNgramsList lId
env <- view hasNodeStory
lst <- runDBQuery $ getNgramsList env lId
let nlz = NgramsListZIP { _nlz_nl = lst, _nlz_list_id = lId}
pure $ addHeader (concat [ "attachment; filename="
, nlzFileName nlz
......@@ -101,7 +104,8 @@ getTsv :: HasNodeStory env err m
=> ListId
-> m (Headers '[Header "Content-Disposition" Text] NgramsTableMap)
getTsv lId = do
lst <- getNgramsList lId
env <- view hasNodeStory
lst <- runDBQuery $ getNgramsList env lId
pure $ case Map.lookup NgramsTerms lst of
Nothing -> noHeader Map.empty
Just (Versioned { _v_data }) ->
......@@ -122,6 +126,7 @@ jsonPostAsync = Named.JSONAPI {
}
------------------------------------------------------------------------
-- NOTE(adn) Make it DB-transactional.
postAsyncJSON :: (HasNodeStory env err m, MonadJobStatus m, MonadLogger m)
=> ListId
-> NgramsList
......@@ -129,27 +134,28 @@ postAsyncJSON :: (HasNodeStory env err m, MonadJobStatus m, MonadLogger m)
-> m ()
postAsyncJSON l ngramsList jobHandle = do
env <- view hasNodeStory
markStarted 2 jobHandle
$(logLocM) DEBUG "[postAsyncJSON] Setting the Ngrams list ..."
setList
setList env
$(logLocM) DEBUG "[postAsyncJSON] Done."
markProgress 1 jobHandle
corpus_node <- getNode l -- (Proxy :: Proxy HyperdataList)
corpus_node <- runDBQuery $ getNode l -- (Proxy :: Proxy HyperdataList)
let corpus_id = fromMaybe (panicTrace "no parent_id") (_node_parent_id corpus_node)
$(logLocM) DEBUG "[postAsyncJSON] Executing re-indexing..."
_ <- reIndexWith corpus_id l NgramsTerms (Set.fromList [MapTerm, CandidateTerm])
_ <- runDBTx $ reIndexWith env corpus_id l NgramsTerms (Set.fromList [MapTerm, CandidateTerm])
$(logLocM) DEBUG "[postAsyncJSON] Re-indexing done."
markComplete jobHandle
where
setList :: HasNodeStory env err m => m ()
setList = do
setList :: IsDBCmd env err m => NodeStoryEnv err -> m ()
setList env = do
-- TODO check with Version for optim
mapM_ (\(nt, Versioned _v ns) -> setListNgrams l nt ns) $ toList ngramsList
runDBTx $ mapM_ (\(nt, Versioned _v ns) -> setListNgrams env l nt ns) $ toList ngramsList
-- TODO reindex
......
......@@ -22,26 +22,27 @@ import Data.Map.Strict qualified as Map
import Data.Text qualified as Text
import Gargantext.API.Ngrams (getNgramsTableMap)
import Gargantext.API.Ngrams.Types
import Gargantext.Core.NodeStory.Types ( HasNodeStory )
import Gargantext.Core.NodeStory.Types ( NodeStoryEnv )
import Gargantext.Core.Text.Context (TermList)
import Gargantext.Core.Text.List.Social.Prelude ( unPatchMapToHashMap )
import Gargantext.Core.Text.Ngrams (NgramsType, ngramsTypes)
import Gargantext.Core.Types.Main ( ListType )
import Gargantext.Database.Admin.Types.Node (ListId)
import Gargantext.Prelude
import Gargantext.Database.Prelude
------------------------------------------------------------------------
getNgramsList :: HasNodeStory env err m
=> ListId -> m NgramsList
getNgramsList lId = fromList
getNgramsList :: NodeStoryEnv err
-> ListId -> DBQuery err x NgramsList
getNgramsList env lId = fromList
<$> zip ngramsTypes
<$> mapM (getNgramsTableMap lId) ngramsTypes
<$> mapM (getNgramsTableMap env lId) ngramsTypes
getTermList :: HasNodeStory env err m
=> ListId -> ListType -> NgramsType -> m (Maybe TermList)
getTermList lId listType ngramsType = do
ngramsList <- getNgramsList lId
getTermList :: NodeStoryEnv err
-> ListId -> ListType -> NgramsType -> DBQuery err x (Maybe TermList)
getTermList err lId listType ngramsType = do
ngramsList <- getNgramsList err lId
pure $ toTermList listType ngramsType ngramsList
......
......@@ -29,6 +29,7 @@ import Gargantext.Core.NodeStory.Types
import Gargantext.Core.Text.Ngrams (NgramsType)
import Gargantext.Core.Types.Main ( ListType(..) )
import Gargantext.Database.Admin.Types.Node ( NodeId, ListId )
import Gargantext.Database.Prelude
import Gargantext.Prelude
......@@ -38,14 +39,8 @@ mergeNgramsElement _neOld neNew = neNew
type RootTerm = NgramsTerm
getRepo :: HasNodeStory env err m
=> [ListId] -> m NodeListStory
getRepo listIds = do
f <- getNodeListStoryMulti
liftBase $ f listIds
-- v <- liftBase $ f listIds
-- v' <- liftBase $ atomically $ readTVar v
-- pure $ v'
getRepo :: NodeStoryEnv err -> [ListId] -> DBQuery err x NodeListStory
getRepo env listIds = getNodeListStoryMulti env listIds
repoSize :: Ord k1 => NodeStory (Map.Map k1 (Map.Map k2 a)) p
......@@ -58,28 +53,19 @@ repoSize repo node_id = Map.map Map.size state'
. a_state
getNodeStory :: HasNodeStory env err m
=> ListId -> m ArchiveList
getNodeStory l = do
f <- getNodeListStory
liftBase $ f l
-- v <- liftBase $ f l
-- pure v
getNodeStory :: NodeStoryEnv err -> ListId -> DBQuery err x ArchiveList
getNodeStory env l = getNodeListStory env l
getNodeListStory :: NodeStoryEnv err
-> NodeId
-> DBQuery err x ArchiveList
getNodeListStory env = view nse_getter env
getNodeListStory :: HasNodeStory env err m
=> m (NodeId -> IO ArchiveList)
getNodeListStory = do
env <- view hasNodeStory
pure $ view nse_getter env
getNodeListStoryMulti :: HasNodeStory env err m
=> m ([NodeId] -> IO NodeListStory)
getNodeListStoryMulti = do
env <- view hasNodeStory
pure $ view nse_getter_multi env
getNodeListStoryMulti :: NodeStoryEnv err
-> [NodeId]
-> DBQuery err x NodeListStory
getNodeListStoryMulti = view nse_getter_multi
listNgramsFromRepo :: [ListId]
......@@ -102,25 +88,27 @@ listNgramsFromRepo nodeIds ngramsType repo =
-- Add a static capability parameter would be nice.
-- Ideally this is the access to `repoVar` which needs to
-- be properly guarded.
getListNgrams :: HasNodeStory env err m
=> [ListId] -> NgramsType
-> m (HashMap NgramsTerm NgramsRepoElement)
getListNgrams nodeIds ngramsType = listNgramsFromRepo nodeIds ngramsType
<$> getRepo nodeIds
getListNgrams :: NodeStoryEnv err
-> [ListId]
-> NgramsType
-> DBQuery err x (HashMap NgramsTerm NgramsRepoElement)
getListNgrams env nodeIds ngramsType = listNgramsFromRepo nodeIds ngramsType
<$> getRepo env nodeIds
-- | Fetch terms from repo, gathering terms under the same root (parent).
getTermsWith :: forall a env err m.
(HasNodeStory env err m, Eq a, Hashable a)
=> (NgramsTerm -> a) -> [ListId]
-> NgramsType -> Set ListType
-> m (HashMap a [a])
getTermsWith f ls ngt lts = HM.fromListWith (<>)
<$> map toTreeWith
<$> HM.toList
<$> HM.filter (\f' -> Set.member (fst f') lts)
<$> mapTermListRoot ls ngt
<$> getRepo ls
getTermsWith :: forall a err x. Hashable a
=> NodeStoryEnv err
-> (NgramsTerm -> a) -> [ListId]
-> NgramsType -> Set ListType
-> DBQuery err x (HashMap a [a])
getTermsWith env f ls ngt lts =
let func = HM.fromListWith (<>)
. map toTreeWith
. HM.toList
. HM.filter (\f' -> Set.member (fst f') lts)
. mapTermListRoot ls ngt
in func <$> getRepo env ls
where
toTreeWith :: (NgramsTerm, (b, Maybe NgramsTerm)) -> (a, [a])
toTreeWith (t, (_lt, maybeRoot)) = case maybeRoot of
......
......@@ -17,6 +17,7 @@ module Gargantext.API.Node.Corpus.Export
where
import Control.Exception.Safe qualified as CES
import Control.Lens (view)
import Data.List qualified as List
import Data.Map.Strict qualified as Map
import Data.Set qualified as Set
......@@ -27,10 +28,12 @@ import Gargantext.API.Node.Corpus.Export.Types ( Corpus(..), CorpusSQLite(..) )
import Gargantext.API.Node.Corpus.Export.Utils (getContextNgrams, mkCorpusSQLite, mkCorpusSQLiteData)
import Gargantext.API.Node.Document.Export.Types qualified as DocumentExport
import Gargantext.API.Prelude (IsGargServer)
import Gargantext.Core.NodeStory
import Gargantext.Core.Text.Ngrams (NgramsType(..))
import Gargantext.Core.Types.Main ( ListType(MapTerm) )
import Gargantext.Database.Admin.Types.Node
import Gargantext.Database.Admin.Types.Hyperdata.Document ( HyperdataDocument(..) )
import Gargantext.Database.Admin.Types.Node
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node ( defaultList )
import Gargantext.Database.Query.Table.NodeContext (selectDocNodes)
import Gargantext.Database.Schema.Context (_context_id)
......@@ -56,35 +59,37 @@ getCorpus cId = Named.CorpusExportAPI {
-> Maybe NgramsType
-> m (Headers '[Header "Content-Disposition" Text] Corpus)
get_corpus lId nt' = do
let
nt = fromMaybe NgramsTerms nt'
env <- view hasNodeStory
runDBQuery $ do
let
nt = fromMaybe NgramsTerms nt'
listId <- case lId of
Nothing -> defaultList cId
Just l -> pure l
listId <- case lId of
Nothing -> defaultList cId
Just l -> pure l
-- FIXME(adn) Audit the usage of this, we are converting from a node
-- to a context id.
ns <- Map.fromList
<$> map (\n -> (nodeId2ContextId $ _context_id n, n))
<$> selectDocNodes cId
-- FIXME(adn) Audit the usage of this, we are converting from a node
-- to a context id.
ns <- Map.fromList
<$> map (\n -> (nodeId2ContextId $ _context_id n, n))
<$> selectDocNodes cId
repo <- getRepo [listId]
ngs <- getContextNgrams cId listId MapTerm nt repo
let -- uniqId is hash computed already for each document imported in database
r = Map.intersectionWith
(\a b -> DocumentExport.Document { _d_document = context2node a
, _d_ngrams = DocumentExport.Ngrams (Set.toList b) (hash b)
, _d_hash = d_hash a b }
) ns (Map.map (Set.map unNgramsTerm) ngs)
where
d_hash :: Context HyperdataDocument -> Set Text -> Text
d_hash _a b = hash [ -- fromMaybe "" (_hd_uniqId $ _context_hyperdata a),
hash b
]
pure $ addHeader ("attachment; filename=GarganText_corpus-" <> pack (show cId) <> ".json")
$ Corpus { _c_corpus = Map.elems r
, _c_hash = hash $ List.map DocumentExport._d_hash $ Map.elems r }
repo <- getRepo env [listId]
ngs <- getContextNgrams cId listId MapTerm nt repo
let -- uniqId is hash computed already for each document imported in database
r = Map.intersectionWith
(\a b -> DocumentExport.Document { _d_document = context2node a
, _d_ngrams = DocumentExport.Ngrams (Set.toList b) (hash b)
, _d_hash = d_hash a b }
) ns (Map.map (Set.map unNgramsTerm) ngs)
where
d_hash :: Context HyperdataDocument -> Set Text -> Text
d_hash _a b = hash [ -- fromMaybe "" (_hd_uniqId $ _context_hyperdata a),
hash b
]
pure $ addHeader ("attachment; filename=GarganText_corpus-" <> pack (show cId) <> ".json")
$ Corpus { _c_corpus = Map.elems r
, _c_hash = hash $ List.map DocumentExport._d_hash $ Map.elems r }
getCorpusSQLite :: ( CES.MonadMask m
......
......@@ -39,7 +39,7 @@ import Gargantext.API.Node.Types
import Gargantext.Core (withDefaultLanguage, defaultLanguage)
import Gargantext.Core.Config (gc_jobs, hasConfig)
import Gargantext.Core.Config.Types (jc_max_docs_parsers)
import Gargantext.Core.NodeStory (HasNodeStoryImmediateSaver, HasNodeArchiveStoryImmediateSaver, currentVersion, NgramsStatePatch', HasNodeStoryEnv)
import Gargantext.Core.NodeStory (currentVersion, NgramsStatePatch', HasNodeStoryEnv (..))
import Gargantext.Core.Text.Corpus.Parsers qualified as Parser (FileType(..), parseFormatC, _ParseFormatError)
import Gargantext.Core.Text.Corpus.Parsers.Types
import Gargantext.Core.Text.Corpus.Query qualified as API
......@@ -54,9 +54,8 @@ import Gargantext.Database.Admin.Types.Hyperdata.Document ( ToHyperdataDocument(
import Gargantext.Database.Admin.Types.Hyperdata.File ( HyperdataFile(..) )
import Gargantext.Database.Admin.Types.Node (CorpusId, NodeType(..), ParentId)
import Gargantext.Database.GargDB qualified as GargDB
import Gargantext.Database.Prelude (readLargeObject, IsDBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNodeWith, getOrMkList)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
import Gargantext.Database.Query.Tree.Root (MkCorpusUser(MkCorpusUserNormalCorpusIds))
import Gargantext.Database.Schema.Node (node_hyperdata)
......@@ -64,6 +63,7 @@ import Gargantext.Prelude
import Gargantext.System.Logging ( logLocM, LogLevel(..) )
import Gargantext.Utils.Jobs.Error (HumanFriendlyErrorText(..))
import Gargantext.Utils.Jobs.Monad (JobHandle, MonadJobStatus(..))
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
------------------------------------------------------------------------
{-
......@@ -150,8 +150,6 @@ type AddWithFile = Summary "Add with MultipartData to corpus endpoint"
addToCorpusWithQuery :: ( FlowCmdM env err m
, MonadJobStatus m
, HasNodeStoryImmediateSaver env
, HasNodeArchiveStoryImmediateSaver env
)
=> User
-> CorpusId
......@@ -169,7 +167,7 @@ addToCorpusWithQuery user cid (WithQuery { _wq_query = q
$(logLocM) DEBUG $ "[addToCorpusWithQuery] flowListWith " <> show flw
$(logLocM) DEBUG $ "[addToCorpusWithQuery] addLanguageToCorpus " <> show cid <> ", " <> show l
addLanguageToCorpus cid l
runDBTx $ addLanguageToCorpus cid l
$(logLocM) DEBUG "[addToCorpusWithQuery] after addLanguageToCorpus"
case datafield of
......@@ -218,11 +216,10 @@ addToCorpusWithQuery user cid (WithQuery { _wq_query = q
$(logLocM) ERROR $ "[addToCorpusWithQuery] error: " <> show err -- log the full error
markFailed (Just err) jobHandle
-- | TODO(adn) Make DB transactional.
addToCorpusWithTempFile :: ( MonadMask m
, FlowCmdM env err m
, MonadJobStatus m
, HasNodeStoryImmediateSaver env
, HasNodeArchiveStoryImmediateSaver env
)
=> User
-> CorpusId
......@@ -233,7 +230,7 @@ addToCorpusWithTempFile user cid nwtf jobHandle = do
$(logLocM) DEBUG $ "Adding documents to corpus: " <> show cid
let l = nwtf ^. wtf_lang . non defaultLanguage
addLanguageToCorpus cid l
runDBTx $ addLanguageToCorpus cid l
limit' <- view $ hasConfig . gc_jobs . jc_max_docs_parsers
let limit = fromIntegral limit' :: Integer
......@@ -331,6 +328,7 @@ addToCorpusWithFile cid input filetype logStatus = do
}
-}
-- NOTE(adn) Not DB-transactional!!
addToCorpusWithFile :: (FlowCmdM env err m, MonadJobStatus m)
=> User
-> CorpusId
......@@ -339,7 +337,7 @@ addToCorpusWithFile :: (FlowCmdM env err m, MonadJobStatus m)
-> m ()
addToCorpusWithFile user cid nwf@(NewWithFile _d (withDefaultLanguage -> l) fName) jobHandle = do
addLanguageToCorpus cid l
runDBTx $ addLanguageToCorpus cid l
$(logLocM) DEBUG $ "[addToCorpusWithFile] Uploading file to corpus: " <> show cid
markStarted 1 jobHandle
......@@ -347,15 +345,19 @@ addToCorpusWithFile user cid nwf@(NewWithFile _d (withDefaultLanguage -> l) fNam
fPath <- GargDB.writeFile nwf
$(logLocM) DEBUG $ "[addToCorpusWithFile] File saved as: " <> show fPath
uId <- getUserId user
nIds <- mkNodeWithParent NodeFile (Just cid) uId fName
cfg <- view hasConfig
nIds <- runDBTx $ do
uId <- getUserId user
mkNodeWithParent cfg NodeFile (Just cid) uId fName
_ <- case nIds of
[nId] -> do
node <- getNodeWith nId (Proxy :: Proxy HyperdataFile)
let hl = node ^. node_hyperdata
_ <- updateHyperdata nId $ hl { _hff_name = fName
, _hff_path = T.pack fPath }
runDBTx $ do
node <- getNodeWith nId (Proxy :: Proxy HyperdataFile)
let hl = node ^. node_hyperdata
void $ updateHyperdata nId $ hl { _hff_name = fName
, _hff_path = T.pack fPath }
$(logLocM) DEBUG $ "[addToCorpusWithFile] Created node with id: " <> show nId
_ -> pure ()
......@@ -371,16 +373,15 @@ addToCorpusWithFile user cid nwf@(NewWithFile _d (withDefaultLanguage -> l) fNam
--- UTILITIES
commitCorpus :: ( IsDBCmd env err m
, HasNodeStoryEnv env
, HasNodeError err
, HasNodeArchiveStoryImmediateSaver env
, HasNodeStoryImmediateSaver env )
commitCorpus :: ( IsDBCmd env err m, HasNodeStoryEnv env err, HasNodeError err
)
=> ParentId
-> User
-> m (Versioned NgramsStatePatch')
commitCorpus cid user = do
userId <- getUserId user
listId <- getOrMkList cid userId
v <- currentVersion listId
commitStatePatch listId (Versioned v mempty)
env <- view hasNodeStory
runDBTx $ do
userId <- getUserId user
listId <- getOrMkList cid userId
v <- currentVersion listId
commitStatePatch env listId (Versioned v mempty)
......@@ -35,6 +35,7 @@ import Gargantext.Database.Action.User (getUserId)
import Gargantext.Database.Admin.Types.Hyperdata.Corpus (HyperdataCorpus)
import Gargantext.Database.Admin.Types.Hyperdata.Document (HyperdataDocument(..))
import Gargantext.Database.Admin.Types.Node (CorpusId, ListId, NodeType(NodeTexts))
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getOrMkList, insertDefaultNodeIfNotExists)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Tree.Error (HasTreeError)
......@@ -146,7 +147,6 @@ insertSearxResponse user cId listId l (Right (SearxResponse { _srs_results })) =
pure ()
-- TODO Make an async task out of this?
triggerSearxSearch :: ( MonadBase IO m
, HasNodeStory env err m
......@@ -163,9 +163,9 @@ triggerSearxSearch :: ( MonadBase IO m
-> JobHandle m
-> m ()
triggerSearxSearch user cId q l jobHandle = do
userId <- getUserId user
_tId <- insertDefaultNodeIfNotExists NodeTexts cId userId
runDBTx $ do
userId <- getUserId user
void $ insertDefaultNodeIfNotExists NodeTexts cId userId
let numPages = 100
markStarted numPages jobHandle
......@@ -174,10 +174,12 @@ triggerSearxSearch user cId q l jobHandle = do
-- printDebug "[triggerSearxSearch] q" q
-- printDebug "[triggerSearxSearch] l" l
cfg <- view hasConfig
uId <- getUserId user
let surl = _f_searx_url $ _gc_frames cfg
-- printDebug "[triggerSearxSearch] surl" surl
listId <- getOrMkList cId uId
listId <- runDBTx $ do
uId <- getUserId user
-- printDebug "[triggerSearxSearch] surl" surl
getOrMkList cId uId
-- printDebug "[triggerSearxSearch] listId" listId
......
......@@ -14,7 +14,7 @@ import Gargantext.Database.Prelude (IsDBCmd)
import Servant.Server.Generic (AsServerT)
makeSubcorpus :: ( HasNodeStoryEnv env
makeSubcorpus :: ( HasNodeStoryEnv env BackendInternalError
, HasNLPServer env
, IsDBCmd env BackendInternalError m
)
......
......@@ -19,20 +19,19 @@ import Control.Lens (over)
import Gargantext.Core (Lang)
import Gargantext.Database.Admin.Types.Hyperdata.Corpus (HyperdataCorpus, _hc_lang)
import Gargantext.Database.Admin.Types.Node (CorpusId)
import Gargantext.Database.Prelude (IsDBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNodeWith)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
import Gargantext.Database.Schema.Node (node_hyperdata)
import Gargantext.Prelude
import Gargantext.Utils.Jobs.Monad (MonadJobStatus)
-- | Updates the 'HyperdataCorpus' with the input 'Lang'.
addLanguageToCorpus :: (HasNodeError err, IsDBCmd env err m, MonadJobStatus m)
addLanguageToCorpus :: HasNodeError err
=> CorpusId
-> Lang
-> m ()
-> DBUpdate err ()
addLanguageToCorpus cId lang = do
hyperNode <- getNodeWith cId (Proxy @HyperdataCorpus)
let hyperNode' = hyperNode & over node_hyperdata (\corpus -> corpus { _hc_lang = Just lang })
......
......@@ -28,6 +28,7 @@ import Gargantext.API.Prelude (IsGargServer)
import Gargantext.API.Routes.Named.Document qualified as Named
import Gargantext.Core (toDBid)
import Gargantext.Database.Admin.Types.Node (DocId, NodeId, NodeType(..))
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Facet (runViewDocuments, Facet(..))
import Gargantext.Database.Query.Table.Node (getClosestParentIdByType)
import Gargantext.Database.Query.Table.Node.User ( getNodeUser )
......@@ -64,12 +65,13 @@ getDocumentsJSON nodeUserId pId = do
get_document_json :: IsGargServer err env m => NodeId -> DocId -> m DocumentExport
get_document_json nodeUserId pId = do
uId <- view node_user_id <$> getNodeUser nodeUserId
mcId <- getClosestParentIdByType pId NodeCorpus
let cId = maybe (panicTrace "[G.A.N.D.Export] Node has no parent") identity mcId
docs <- runViewDocuments cId False Nothing Nothing Nothing Nothing Nothing
pure DocumentExport { _de_documents = mapFacetDoc uId <$> docs
, _de_garg_version = T.pack $ showVersion PG.version }
runDBQuery $ do
uId <- view node_user_id <$> getNodeUser nodeUserId
mcId <- getClosestParentIdByType pId NodeCorpus
let cId = maybe (panicTrace "[G.A.N.D.Export] Node has no parent") identity mcId
docs <- runViewDocuments cId False Nothing Nothing Nothing Nothing Nothing
pure DocumentExport { _de_documents = mapFacetDoc uId <$> docs
, _de_garg_version = T.pack $ showVersion PG.version }
where
mapFacetDoc uId (FacetDoc { .. }) =
Document { _d_document =
......
......@@ -28,7 +28,7 @@ import Gargantext.API.Routes.Named.Document qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core (Lang(..))
import Gargantext.Core.NLP (HasNLPServer)
import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv, HasNodeArchiveStoryImmediateSaver)
import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv)
import Gargantext.Core.Text.Corpus.Parsers.Date (mDateSplit)
import Gargantext.Core.Text.Terms (TermType(..))
import Gargantext.Core.Types.Individu (User(..))
......@@ -39,7 +39,7 @@ import Gargantext.Database.Action.Flow.Types ( FlowCmdM )
import Gargantext.Database.Admin.Types.Hyperdata.Corpus ( HyperdataCorpus )
import Gargantext.Database.Admin.Types.Hyperdata.Document (HyperdataDocument(..))
import Gargantext.Database.Admin.Types.Node ( DocId, NodeId, NodeType(NodeCorpus), ParentId )
import Gargantext.Database.Prelude (IsDBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.Node (getClosestParentIdByType')
import Gargantext.Database.Schema.Node (_node_hyperdata)
......@@ -71,7 +71,7 @@ documentUpload :: (FlowCmdM env err m)
-> DocumentUpload
-> m [DocId]
documentUpload nId doc = do
mcId <- getClosestParentIdByType' nId NodeCorpus
mcId <- runDBQuery $ getClosestParentIdByType' nId NodeCorpus
let cId = case mcId of
Just c -> c
Nothing -> panicTrace $ T.pack $ "[G.A.N.DU] Node has no corpus parent: " <> show nId
......@@ -107,8 +107,7 @@ documentUpload nId doc = do
-- only compatible versions.
remoteImportDocuments :: ( HasNodeError err
, HasNLPServer env
, HasNodeArchiveStoryImmediateSaver env
, HasNodeStoryEnv env
, HasNodeStoryEnv env err
, IsDBCmd env err m
, MonadLogger m
, MonadIO m)
......
......@@ -28,7 +28,7 @@ import Gargantext.API.Prelude (GargM)
import Gargantext.API.Routes.Named.Document qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core (Lang(..))
import Gargantext.Core.NodeStory (HasNodeStoryImmediateSaver, HasNodeArchiveStoryImmediateSaver, currentVersion)
import Gargantext.Core.NodeStory (currentVersion, hasNodeStory)
import Gargantext.Core.Text.Corpus.Parsers.Date (split')
import Gargantext.Core.Text.Corpus.Parsers.FrameWrite (Author(..), Parsed(..), parseLines, text2titleParagraphs)
import Gargantext.Core.Text.Terms (TermType(..))
......@@ -41,11 +41,13 @@ import Gargantext.Database.Admin.Types.Hyperdata.Frame ( HyperdataFrame(..), get
import Gargantext.Database.Admin.Types.Node ( NodeId, Node, NodeType(..) )
import Gargantext.Database.Query.Table.Node (getChildrenByType, getClosestParentIdByType', getNodeWith, getOrMkList)
import Gargantext.Database.Schema.Node (node_hyperdata, node_name, node_date)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Gargantext.System.Logging (logLocM, LogLevel(..))
import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..))
import Gargantext.Utils.Jobs.Error (HumanFriendlyErrorText(..))
import Servant.Server.Generic (AsServerT)
import Control.Lens (view)
api :: AuthenticatedUser
-- ^ The logged-in user
......@@ -61,8 +63,7 @@ api authenticatedUser nId =
documentsFromWriteNodes :: ( FlowCmdM env err m
, MonadJobStatus m
, HasNodeStoryImmediateSaver env
, HasNodeArchiveStoryImmediateSaver env )
)
=> AuthenticatedUser
-- ^ The logged-in user
-> NodeId
......@@ -73,7 +74,7 @@ documentsFromWriteNodes authenticatedUser nId Params { selection, lang, paragrap
markStarted 2 jobHandle
markProgress 1 jobHandle
mcId <- getClosestParentIdByType' nId NodeCorpus
mcId <- runDBQuery $ getClosestParentIdByType' nId NodeCorpus
cId <- case mcId of
Just cId -> pure cId
Nothing -> do
......@@ -82,10 +83,10 @@ documentsFromWriteNodes authenticatedUser nId Params { selection, lang, paragrap
markFailed (Just $ UnsafeMkHumanFriendlyErrorText "The requested node has no corpus parent.") jobHandle
panicTrace msg
frameWriteIds <- getChildrenByType nId Notes
frameWriteIds <- runDBQuery $ getChildrenByType nId Notes
-- https://write.frame.gargantext.org/<frame_id>/download
frameWrites <- mapM (\id -> getNodeWith id (Proxy :: Proxy HyperdataFrame)) frameWriteIds
frameWrites <- mapM (\id -> runDBQuery $ getNodeWith id (Proxy :: Proxy HyperdataFrame)) frameWriteIds
frameWritesWithContents <- liftBase $
mapM (\node -> do
......@@ -107,9 +108,11 @@ documentsFromWriteNodes authenticatedUser nId Params { selection, lang, paragrap
-- FIXME(adn) If we were to store the UserID inside an 'AuthenticatedUser', we wouldn't need this.
listId <- getOrMkList cId userId
v <- currentVersion listId
_ <- commitStatePatch listId (Versioned v mempty)
env <- view hasNodeStory
runDBTx $ do
listId <- getOrMkList cId userId
v <- currentVersion listId
void $ commitStatePatch env listId (Versioned v mempty)
markProgress 1 jobHandle
where
......
......@@ -34,10 +34,13 @@ import Gargantext.Database.GargDB qualified as GargDB
import Gargantext.Database.Query.Table.Node (getNodeWith)
import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
import Gargantext.Database.Schema.Node (node_hyperdata)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..))
import Servant
import Servant.Server.Generic (AsServerT)
import Control.Lens (view)
import Gargantext.Core.Config (hasConfig)
fileApi :: (FlowCmdM env err m)
......@@ -52,7 +55,7 @@ fileDownload nId = do
-- printDebug "[fileDownload] uId" uId
-- printDebug "[fileDownload] nId" nId
node <- getNodeWith nId (Proxy :: Proxy HyperdataFile)
node <- runDBQuery $ getNodeWith nId (Proxy :: Proxy HyperdataFile)
let (HyperdataFile { _hff_name = name'
, _hff_path = path }) = node ^. node_hyperdata
......@@ -92,24 +95,26 @@ addWithFile :: (FlowCmdM env err m, MonadJobStatus m)
-> JobHandle m
-> m ()
addWithFile authenticatedUser nId nwf@(NewWithFile _d _l fName) jobHandle = do
cfg <- view hasConfig
-- printDebug "[addWithFile] Uploading file: " nId
markStarted 1 jobHandle
fPath <- GargDB.writeFile nwf
-- printDebug "[addWithFile] File saved as: " fPath
nIds <- mkNodeWithParent NodeFile (Just nId) userId fName
_ <- case nIds of
[nId'] -> do
runDBTx $ do
nIds <- mkNodeWithParent cfg NodeFile (Just nId) userId fName
case nIds of
[nId'] -> do
node <- getNodeWith nId' (Proxy :: Proxy HyperdataFile)
let hl = node ^. node_hyperdata
_ <- updateHyperdata nId' $ hl { _hff_name = fName
, _hff_path = T.pack fPath }
void $ updateHyperdata nId' $ hl { _hff_name = fName
, _hff_path = T.pack fPath }
-- printDebug "[addWithFile] Created node with id: " nId'
pure ()
_ -> pure ()
_ -> pure ()
-- printDebug "[addWithFile] File upload finished: " nId
markComplete jobHandle
......
......@@ -29,14 +29,13 @@ import Gargantext.API.Prelude ( GargM )
import Gargantext.API.Routes.Named.FrameCalc qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core.Config (HasConfig)
import Gargantext.Core.NodeStory.Types ( HasNodeArchiveStoryImmediateSaver )
import Gargantext.Core.Types.Individu (User(..))
import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
import Gargantext.Database.Action.Flow.Types ( FlowCmdM )
import Gargantext.Database.Admin.Types.Hyperdata.Frame ( HyperdataFrame(..) )
import Gargantext.Database.Admin.Types.Node ( NodeId, NodeType(NodeCorpus) )
import Gargantext.Database.Query.Table.Node (getClosestParentIdByType, getNodeWith)
import Gargantext.Database.Prelude (createLargeObject)
import Gargantext.Database.Prelude
import Gargantext.Database.Schema.Node (node_hyperdata)
import Gargantext.Prelude
import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..), markFailureNoErr)
......@@ -60,7 +59,6 @@ frameCalcUploadAsync :: ( MonadMask m
, HasConfig env
, FlowCmdM env err m
, MonadJobStatus m
, HasNodeArchiveStoryImmediateSaver env
)
=> AuthenticatedUser
-- ^ The logged-in user
......@@ -74,7 +72,7 @@ frameCalcUploadAsync authenticatedUser nId (FrameCalcUpload _wtf_lang _wtf_selec
-- printDebug "[frameCalcUploadAsync] uId" uId
-- printDebug "[frameCalcUploadAsync] nId" nId
node <- getNodeWith nId (Proxy :: Proxy HyperdataFrame)
node <- runDBQuery $ getNodeWith nId (Proxy :: Proxy HyperdataFrame)
let (HyperdataFrame { _hf_base = base
, _hf_frame_id = frame_id }) = node ^. node_hyperdata
......@@ -89,7 +87,7 @@ frameCalcUploadAsync authenticatedUser nId (FrameCalcUpload _wtf_lang _wtf_selec
PSQL.Oid oId <- createLargeObject body
-- printDebug "body" body
mCId <- getClosestParentIdByType nId NodeCorpus
mCId <- runDBQuery $ getClosestParentIdByType nId NodeCorpus
-- printDebug "[frameCalcUploadAsync] mCId" mCId
case mCId of
......
......@@ -12,8 +12,6 @@ Async new node feature
-}
{-# LANGUAGE IncoherentInstances #-}
module Gargantext.API.Node.New
where
......@@ -25,13 +23,14 @@ import Gargantext.API.Node.New.Types (PostNode(..))
import Gargantext.API.Prelude (GargM)
import Gargantext.API.Routes.Named.Node qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core.Config (hasConfig)
import Gargantext.Core.Mail.Types (HasMail)
import Gargantext.Core.Notifications.CentralExchange.Types qualified as CE
import Gargantext.Core.NLP (HasNLPServer)
import Gargantext.Core.Notifications.CentralExchange.Types qualified as CE
import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
import Gargantext.Database.Action.Node (mkNodeWithParent)
import Gargantext.Database.Admin.Types.Node
import Gargantext.Database.Prelude (IsDBCmdExtra, DBCmdWithEnv)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError(..))
import Gargantext.Prelude
import Servant.Server.Generic (AsServerT)
......@@ -85,7 +84,8 @@ postNode' :: ( IsDBCmdExtra env err m
postNode' authenticatedUser nId (PostNode nodeName tn) = do
let userId = authenticatedUser ^. auth_user_id
nodeIds <- mkNodeWithParent tn (Just nId) userId nodeName
cfg <- view hasConfig
nodeIds <- runDBTx $ mkNodeWithParent cfg tn (Just nId) userId nodeName
-- mapM_ (CE.ce_notify . CE.UpdateTreeFirstLevel) nodeIds
CE.ce_notify $ CE.UpdateTreeFirstLevel nId
......
......@@ -18,6 +18,7 @@ import Gargantext.API.Routes.Named.Viz qualified as Named
import Gargantext.Core.Viz.Phylo.API.Tools (getPhyloData, phylo2dot, phylo2dot2json)
import Gargantext.Core.Viz.Phylo.Example (phyloCleopatre)
import Gargantext.Database.Admin.Types.Node (PhyloId, NodeId)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Servant
import Servant.Server.Generic (AsServerT)
......@@ -37,7 +38,7 @@ getPhyloJson :: NodeId
-> PhyloId
-> GargNoServer (Headers '[Header "Content-Disposition" T.Text] Value)
getPhyloJson _ pId = do
maybePhyloData <- getPhyloData pId
maybePhyloData <- runDBQuery $ getPhyloData pId
let phyloData = fromMaybe phyloCleopatre maybePhyloData
phyloJson <- liftBase $ phylo2dot2json phyloData
pure $ addHeader (T.concat [ "attachment; filename="
......@@ -51,7 +52,7 @@ getPhyloDot :: NodeId
-> PhyloId
-> GargNoServer (Headers '[Header "Content-Disposition" T.Text] T.Text)
getPhyloDot _ pId = do
maybePhyloData <- getPhyloData pId
maybePhyloData <- runDBQuery $ getPhyloData pId
let phyloData = fromMaybe phyloCleopatre maybePhyloData
phyloDot <- liftBase $ phylo2dot phyloData
pure $ addHeader (T.concat [ "attachment; filename="
......
......@@ -19,17 +19,19 @@ import Data.Text qualified as Text
import Gargantext.API.Node.Share.Types (ShareNodeParams(..))
import Gargantext.API.Prelude (IsGargServer)
import Gargantext.API.Routes.Named.Share qualified as Named
import Gargantext.Core.Notifications.CentralExchange.Types (CEMessage)
import Gargantext.Core.Types.Individu (User(..), arbitraryUsername)
import Gargantext.Database.Action.Share (ShareNodeWith(..))
import Gargantext.Database.Action.Share as DB (shareNodeWith, unshare)
import Gargantext.Database.Action.User (getUserId', getUsername)
import Gargantext.Database.Action.User.New (guessUserName, newUser)
import Gargantext.Database.Admin.Types.Node (NodeId, NodeType(..), UserId(..))
import Gargantext.Database.Prelude (IsDBCmdExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError(..))
import Gargantext.Database.Query.Tree (findNodesWithType)
import Gargantext.Prelude
import Servant.Server.Generic (AsServerT)
import qualified Gargantext.Core.Notifications.CentralExchange.Types as CE
------------------------------------------------------------------------
-- TODO permission
......@@ -48,20 +50,20 @@ api userInviting nId (ShareTeamParams user') = do
user <- case guessUserName user'' of
Nothing -> pure user''
Just (u, _) -> do
isRegistered <- getUserId' (UserName u)
isRegistered <- runDBQuery $ getUserId' (UserName u)
case isRegistered of
Right _ -> do
-- printDebug "[G.A.N.Share.api]" ("Team shared with " <> u)
pure u
Left _err -> do
username' <- getUsername userInviting
username' <- runDBQuery (getUsername userInviting)
unless (username' `List.elem` arbitraryUsername) $ do
-- TODO better analysis of the composition of what is shared
children <- findNodesWithType nId [NodeList] [ NodeFolderShared
, NodeTeam
, NodeFolder
, NodeCorpus
]
children <- runDBQuery $ findNodesWithType nId [NodeList] [ NodeFolderShared
, NodeTeam
, NodeFolder
, NodeCorpus
]
_ <- if List.null children
then do
-- printDebug "[G.A.N.Share.api]" ("Invitation is enabled if you share a corpus at least" :: Text)
......@@ -72,11 +74,22 @@ api userInviting nId (ShareTeamParams user') = do
pure ()
pure u
fromIntegral <$> DB.shareNodeWith (ShareNodeWith_User NodeFolderShared (UserName user)) nId
fromIntegral <$> shareNodeAndNotify (shareNodeWith (ShareNodeWith_User NodeFolderShared (UserName user)) nId)
api _uId nId2 (SharePublicParams nId1) =
fromIntegral <$> DB.shareNodeWith (ShareNodeWith_Node NodeFolderPublic nId1) nId2
fromIntegral <$> shareNodeAndNotify (shareNodeWith (ShareNodeWith_Node NodeFolderPublic nId1) nId2)
shareNodeAndNotify :: ( HasNodeError err
, IsDBCmdExtra env err m
, MonadRandom m
)
=> DBUpdate err (Int, [CEMessage])
-> m Int
shareNodeAndNotify dbTx = do
(res, msgs) <- runDBTx dbTx
forM_ msgs CE.ce_notify
pure res
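-- The same pattern generalises to any DB operation that hands back its CE
-- messages alongside its result. A minimal sketch (the helper name is
-- hypothetical; the constraint set simply mirrors shareNodeAndNotify above
-- and may well be loosenable):
notifyingDBTx :: ( HasNodeError err
                 , IsDBCmdExtra env err m
                 , MonadRandom m
                 )
              => DBUpdate err (a, [CEMessage])
              -> m a
notifyingDBTx dbTx = do
  -- run the whole transaction first ...
  (res, msgs) <- runDBTx dbTx
  -- ... and only notify Central Exchange once it has committed
  forM_ msgs CE.ce_notify
  pure res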
-- | Unshare a previously shared node via the /share endpoint.
unShare :: IsGargServer env err m => NodeId -> Named.UnshareNode (AsServerT m)
unShare = Named.UnshareNode . DB.unshare
unShare p = Named.UnshareNode (\n -> runDBTx $ DB.unshare p n)
......@@ -24,7 +24,7 @@ import Gargantext.API.Node.Update.Types (Method(..), UpdateNodeParams(..), Updat
import Gargantext.API.Prelude (GargM, simuLogs)
import Gargantext.API.Routes.Named.Node qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core.NodeStory.Types (HasNodeStory)
import Gargantext.Core.NodeStory.Types (HasNodeStory, hasNodeStory)
import Gargantext.Core.Text.Ngrams (NgramsType(NgramsTerms))
import Gargantext.Core.Types.Main (ListType(..))
import Gargantext.Core.Viz.Graph.API (recomputeGraph)
......@@ -39,6 +39,7 @@ import Gargantext.Database.Admin.Types.Node ( NodeId, NodeType(NodeCorpus, NodeA
import Gargantext.Database.Query.Table.Node (defaultList, getNode, getChildrenByType, getNodeWith)
import Gargantext.Database.Query.Table.Node.UpdateOpaleye (updateHyperdata)
import Gargantext.Database.Schema.Node (node_parent_id, node_hyperdata)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Gargantext.System.Logging ( MonadLogger )
import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..))
......@@ -54,6 +55,7 @@ api nId =
, _un_args = p }
}
-- TODO(adn) Make DB-transactional.
updateNode :: (HasNodeStory env err m
, MonadJobStatus m
, MonadLogger m
......@@ -73,11 +75,12 @@ updateNode nId (UpdateNodeParamsGraph
markComplete jobHandle
updateNode nid1 (LinkNodeReq nt nid2) jobHandle = do
env <- view hasNodeStory
markStarted 2 jobHandle
markProgress 1 jobHandle
_ <- case nt of
NodeAnnuaire -> pairing nid2 nid1 Nothing -- defaultList
NodeCorpus -> pairing nid1 nid2 Nothing -- defaultList
NodeAnnuaire -> runDBTx $ pairing env nid2 nid1 Nothing -- defaultList
NodeCorpus -> runDBTx $ pairing env nid1 nid2 Nothing -- defaultList
_ -> panicTrace $ "[G.API.N.Update.updateNode] NodeType not implemented"
<> show nt <> " nid1: " <> show nid1 <> " nid2: " <> show nid2
......@@ -86,7 +89,7 @@ updateNode nid1 (LinkNodeReq nt nid2) jobHandle = do
-- | `Advanced` to update graphs
updateNode lId (UpdateNodeParamsList Advanced) jobHandle = do
markStarted 4 jobHandle
corpusId <- view node_parent_id <$> getNode lId
corpusId <- view node_parent_id <$> (runDBQuery $ getNode lId)
markProgress 1 jobHandle
......@@ -103,16 +106,17 @@ updateNode lId (UpdateNodeParamsList Advanced) jobHandle = do
markComplete jobHandle
updateNode lId (UpdateNodeParamsList _mode) jobHandle = do
env <- view hasNodeStory
markStarted 3 jobHandle
corpusId <- view node_parent_id <$> getNode lId
corpusId <- view node_parent_id <$> runDBQuery (getNode lId)
markProgress 1 jobHandle
_ <- case corpusId of
Just cId -> do
_ <- reIndexWith cId lId NgramsTerms (Set.singleton MapTerm)
_ <- runDBTx $ reIndexWith env cId lId NgramsTerms (Set.singleton MapTerm)
markProgress 1 jobHandle
_ <- updateNgramsOccurrences cId lId
_ <- runDBTx $ updateNgramsOccurrences env cId lId
pure ()
Nothing -> pure ()
......@@ -120,7 +124,7 @@ updateNode lId (UpdateNodeParamsList _mode) jobHandle = do
updateNode phyloId (UpdateNodePhylo config) jobHandle = do
markStarted 3 jobHandle
oldPhylo <- getNodeWith phyloId (Proxy @HyperdataPhylo)
oldPhylo <- runDBQuery $ getNodeWith phyloId (Proxy @HyperdataPhylo)
let corpusId' = view node_parent_id oldPhylo
let mbComputeHistory = oldPhylo ^? node_hyperdata . hp_data . traverse . phylo_computeTime . _Just
markProgress 1 jobHandle
......@@ -137,7 +141,7 @@ updateNode phyloId (UpdateNodePhylo config) jobHandle = do
, _scst_events = Just []
}
-}
_ <- timeMeasured "updateNode.updateHyperdataPhylo" $ updateHyperdata phyloId (HyperdataPhylo Nothing (Just phy))
_ <- timeMeasured "updateNode.updateHyperdataPhylo" $ runDBTx $ updateHyperdata phyloId (HyperdataPhylo Nothing (Just phy))
-- TODO: catch the error of sendMail if userId is not found, then debug
-- sendMail (UserDBId userId)
......@@ -145,7 +149,7 @@ updateNode phyloId (UpdateNodePhylo config) jobHandle = do
updateNode tId (UpdateNodeParamsTexts _mode) jobHandle = do
markStarted 2 jobHandle
corpusId <- view node_parent_id <$> getNode tId
corpusId <- view node_parent_id <$> (runDBQuery $ getNode tId)
markProgress 1 jobHandle
_ <- case corpusId of
......@@ -162,11 +166,11 @@ updateNode tId
markStarted 5 jobHandle
markProgress 1 jobHandle
_ <- getNode tId
childTexts <- getChildrenByType tId NodeTexts
childGraphs <- getChildrenByType tId NodeGraph
childPhylos <- getChildrenByType tId NodePhylo
childNodeLists <- getChildrenByType tId NodeList
_ <- runDBQuery (getNode tId)
childTexts <- runDBQuery $ getChildrenByType tId NodeTexts
childGraphs <- runDBQuery $ getChildrenByType tId NodeGraph
childPhylos <- runDBQuery $ getChildrenByType tId NodePhylo
childNodeLists <- runDBQuery $ getChildrenByType tId NodeList
mapM_ (\cId -> updateNode cId (UpdateNodeParamsTexts methodTexts) jobHandle) childTexts
markProgress 1 jobHandle
......@@ -189,15 +193,16 @@ updateDocs :: ( HasNodeStory env err m
-> JobHandle m
-> m ()
updateDocs cId jobHandle = do
env <- view hasNodeStory
markStarted 4 jobHandle
lId <- defaultList cId
_ <- reIndexWith cId lId NgramsTerms (Set.singleton MapTerm)
lId <- runDBQuery $ defaultList cId
_ <- runDBTx $ reIndexWith env cId lId NgramsTerms (Set.singleton MapTerm)
markProgress 1 jobHandle
_ <- updateNgramsOccurrences cId lId
_ <- runDBTx $ updateNgramsOccurrences env cId lId
markProgress 1 jobHandle
_ <- updateContextScore cId lId
_ <- runDBTx $ updateContextScore env cId lId
markProgress 1 jobHandle
_ <- Metrics.updateChart' cId lId NgramsTypes.Docs Nothing
_ <- runDBTx $ Metrics.updateChart' cId lId NgramsTypes.Docs Nothing
markProgress 1 jobHandle
-- printDebug "updateContextsScore" (cId, lId, u)
pure ()
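-- A possible follow-up to the TODO(adn) above, sketched only with primitives
-- already used in this module: every step of updateDocs is a DB operation, so
-- the whole update could also run as one atomic DBUpdate, with progress
-- reporting left to the caller. 'updateDocsTx' is hypothetical and assumes
-- NodeStoryEnv is imported from Gargantext.Core.NodeStory.Types.
updateDocsTx :: NodeStoryEnv err -> NodeId -> DBUpdate err ()
updateDocsTx env cId = do
  lId <- defaultList cId
  _ <- reIndexWith env cId lId NgramsTerms (Set.singleton MapTerm)
  _ <- updateNgramsOccurrences env cId lId
  _ <- updateContextScore env cId lId
  void $ Metrics.updateChart' cId lId NgramsTypes.Docs Nothing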
......
......@@ -40,10 +40,10 @@ import Servant
authenticationError :: (MonadError e m, HasAuthenticationError e) => AuthenticationError -> m a
authenticationError = throwError . (_AuthenticationError #)
type EnvC env =
type EnvC env err =
( HasConnectionPool env
, HasConfig env
, HasNodeStoryEnv env
, HasNodeStoryEnv env err
, HasMail env
, HasNLPServer env
, HasManager env
......@@ -66,7 +66,7 @@ type GargServerC env err m =
, HasMail env
, MonadRandom m
, Safe.MonadCatch m
, EnvC env
, EnvC env err
, ErrC err
, ToJSON err
)
......@@ -82,7 +82,7 @@ class (MonadLogger m, GargServerC env err m) => IsGargServer env err m
type GargM env err = ReaderT env (ExceptT err IO)
-- This is the server type using GargM. It needs to be used as little as possible.
-- Instead, prefer GargServer, GargServerT.
type GargServerM env err api = (EnvC env, ErrC err) => ServerT api (GargM env err)
type GargServerM env err api = (EnvC env err, ErrC err) => ServerT api (GargM env err)
-------------------------------------------------------------------
-- | This Type is needed to prepare the function before the GargServer
......
......@@ -28,6 +28,7 @@ import Gargantext.Core.Types.Search (toRow)
import Gargantext.Database.Action.Flow.Pairing (isPairedWith)
import Gargantext.Database.Action.Search (searchInCorpus, searchInCorpusWithContacts)
import Gargantext.Database.Admin.Types.Node (NodeId, NodeType(..))
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Gargantext.System.Logging (logLocM, LogLevel(..))
import Servant.Server.Generic (AsServerT)
......@@ -44,10 +45,10 @@ api nId = Named.SearchAPI $ \query o l order -> case query of
$(logLocM) DEBUG $ T.pack "New search started with query = " <> (getRawQuery rawQuery)
SearchResult <$> SearchResultDoc
<$> map (toRow nId)
<$> searchInCorpus nId False q o l order
<$> runDBQuery (searchInCorpus nId False q o l order)
(SearchQuery rawQuery SearchContact) -> case parseQuery rawQuery of
Left err -> pure $ SearchResult $ SearchNoResult (T.pack err)
Right q -> do
Right q -> runDBQuery $ do
-- printDebug "isPairedWith" nId
aIds <- isPairedWith nId NodeAnnuaire
-- TODO if paired with several corpus
......
......@@ -6,7 +6,7 @@ module Gargantext.API.Server.Named.Ngrams (
, tableNgramsPostChartsAsync
) where
import Control.Lens ((%%~))
import Control.Lens ((%%~), view)
import Data.Map.Strict qualified as Map
import Data.Set qualified as Set
import Gargantext.API.Admin.Auth.Types (AuthenticatedUser, PathId (..))
......@@ -19,27 +19,26 @@ import Gargantext.API.Ngrams.Types
import Gargantext.API.Prelude (GargM)
import Gargantext.API.Routes.Named.Table qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core.NodeStory.Types (HasNodeStory)
import Gargantext.Core.NodeStory.Types (HasNodeStory, NodeStoryEnv, hasNodeStory, HasNodeStoryEnv)
import Gargantext.Core.Types (DocId, ListId, ListType(..), NodeId, NodeType(..))
import Gargantext.Core.Types.Query (Limit(..), Offset(..))
import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
import Gargantext.Database.Admin.Config (userMaster)
import Gargantext.Database.Query.Table.Ngrams ( selectNgramsByDoc )
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.Node (getNode)
import Gargantext.Database.Query.Table.Node.Select ( selectNodesWithUsername )
import Gargantext.Database.Schema.Node (node_id, node_parent_id, node_user_id)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Gargantext.System.Logging
import Gargantext.Utils.Jobs.Monad (JobHandle, MonadJobStatus(..), markFailedNoErr)
import Servant.Server.Generic (AsServerT)
apiNgramsTableCorpus :: NodeId -> Named.TableNgramsAPI (AsServerT (GargM Env BackendInternalError))
apiNgramsTableCorpus cId = Named.TableNgramsAPI
{ tableNgramsGetAPI = Named.TableNgramsApiGet $ getTableNgramsCorpus cId
{ tableNgramsGetAPI = Named.TableNgramsApiGet $ getTableNgramsCorpusHandler cId
, tableNgramsPutAPI = Named.TableNgramsApiPut $ tableNgramsPut
, recomputeScoresEp = Named.RecomputeScoresNgramsApiGet $ scoresRecomputeTableNgrams cId
, recomputeScoresEp = Named.RecomputeScoresNgramsApiGet $ scoresRecomputeTableNgramsHandler cId
, tableNgramsGetVersionEp = Named.TableNgramsApiGetVersion $ getTableNgramsVersion cId
, tableNgramsAsyncAPI = apiNgramsAsync cId
}
......@@ -49,21 +48,20 @@ apiNgramsTableDoc :: AuthenticatedUser
-> DocId
-> Named.TableNgramsAPI (AsServerT (GargM Env BackendInternalError))
apiNgramsTableDoc uid dId = withNamedAccess uid (PathNode dId) $ Named.TableNgramsAPI
{ tableNgramsGetAPI = Named.TableNgramsApiGet $ getTableNgramsDoc dId
{ tableNgramsGetAPI = Named.TableNgramsApiGet $ getTableNgramsDocHandler dId
, tableNgramsPutAPI = Named.TableNgramsApiPut tableNgramsPut
, recomputeScoresEp = Named.RecomputeScoresNgramsApiGet $ scoresRecomputeTableNgrams dId
, recomputeScoresEp = Named.RecomputeScoresNgramsApiGet $ scoresRecomputeTableNgramsHandler dId
, tableNgramsGetVersionEp = Named.TableNgramsApiGetVersion $ getTableNgramsVersion dId
, tableNgramsAsyncAPI = apiNgramsAsync dId
}
getTableNgramsVersion :: ( HasNodeStory env err m
, HasNodeError err )
getTableNgramsVersion :: (IsDBCmd err env m, HasNodeStoryEnv err env)
=> NodeId
-> TabType
-> ListId
-> m Version
getTableNgramsVersion _nId _tabType listId = currentVersion listId
getTableNgramsVersion _nId _tabType listId = runDBQuery $ currentVersion listId
apiNgramsAsync :: NodeId -> Named.TableNgramsAsyncAPI (AsServerT (GargM Env BackendInternalError))
......@@ -83,7 +81,7 @@ tableNgramsPostChartsAsync utn jobHandle = do
let tabType = utn ^. utn_tab_type
let listId = utn ^. utn_list_id
node <- getNode listId
node <- runDBQuery $ getNode listId
let _nId = node ^. node_id
_uId = node ^. node_user_id
mCId = node ^. node_parent_id
......@@ -150,11 +148,15 @@ tableNgramsPostChartsAsync utn jobHandle = do
}
-}
scoresRecomputeTableNgrams :: forall env err m.
( HasNodeStory env err m, HasNodeError err, MonadLogger m )
=> NodeId -> TabType -> ListId -> m Int
scoresRecomputeTableNgrams nId tabType listId = do
tableMap <- getNgramsTableMap listId ngramsType
scoresRecomputeTableNgramsHandler :: (IsDBCmd err env m, HasNodeStoryEnv err env)
=> NodeId -> TabType -> ListId -> m Int
scoresRecomputeTableNgramsHandler nId tabType listId = do
env <- view hasNodeStory
runDBQuery $ scoresRecomputeTableNgrams env nId tabType listId
scoresRecomputeTableNgrams :: NodeStoryEnv err -> NodeId -> TabType -> ListId -> DBQuery err x Int
scoresRecomputeTableNgrams env nId tabType listId = do
tableMap <- getNgramsTableMap env listId ngramsType
_ <- tableMap & v_data %%~ (setNgramsTableScores nId listId ngramsType)
. Map.mapWithKey ngramsElementFromRepo
......@@ -162,19 +164,28 @@ scoresRecomputeTableNgrams nId tabType listId = do
where
ngramsType = ngramsTypeFromTabType tabType
getTableNgramsDocHandler :: (IsDBCmd err env m, HasNodeStoryEnv err env)
=> DocId -> TabType
-> ListId -> Limit -> Maybe Offset
-> Maybe ListType
-> Maybe MinSize -> Maybe MaxSize
-> Maybe OrderBy
-> Maybe Text -- full text search
-> m (VersionedWithCount NgramsTable)
getTableNgramsDocHandler dId tabType listId limit_ offset listType minSize maxSize orderBy _mt = do
env <- view hasNodeStory
runDBQuery $ getTableNgramsDoc env dId tabType listId limit_ offset listType minSize maxSize orderBy _mt
-- | Text search is deactivated for now for ngrams by doc only
getTableNgramsDoc :: ( HasNodeStory env err m
, HasNodeError err
, MonadLogger m
)
=> DocId -> TabType
getTableNgramsDoc :: NodeStoryEnv err
-> DocId -> TabType
-> ListId -> Limit -> Maybe Offset
-> Maybe ListType
-> Maybe MinSize -> Maybe MaxSize
-> Maybe OrderBy
-> Maybe Text -- full text search
-> m (VersionedWithCount NgramsTable)
getTableNgramsDoc dId tabType listId limit_ offset listType minSize maxSize orderBy _mt = do
-> DBQuery err x (VersionedWithCount NgramsTable)
getTableNgramsDoc env dId tabType listId limit_ offset listType minSize maxSize orderBy _mt = do
ns <- selectNodesWithUsername NodeList userMaster
let ngramsType = ngramsTypeFromTabType tabType
ngs <- selectNgramsByDoc (ns <> [listId]) dId ngramsType
......@@ -188,6 +199,6 @@ getTableNgramsDoc dId tabType listId limit_ offset listType minSize maxSize orde
, _nsq_orderBy = orderBy
, _nsq_searchQuery = searchQueryFn
}
getTableNgrams dId listId tabType searchQuery
getTableNgrams env dId listId tabType searchQuery
......@@ -17,7 +17,7 @@ import Gargantext.Database.Admin.Types.Hyperdata.Corpus ( hc_fields )
import Gargantext.Database.Admin.Types.Hyperdata.CorpusField
import Gargantext.Database.Admin.Types.Hyperdata.Folder ( HyperdataFolder )
import Gargantext.Database.Admin.Types.Node ( NodeId(..), Node, unNodeId )
import Gargantext.Database.Prelude (DBCmd, DBCmdExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError(..))
import Gargantext.Database.Query.Table.NodeNode (selectPublicNodes)
import Gargantext.Database.Schema.Node ( NodePoly(..), node_date, node_hyperdata ) -- (NodePoly(..))
......@@ -33,14 +33,14 @@ serverPublicGargAPI baseUrl = Named.GargPublicAPI $
}
api_home :: IsGargServer env err m => Text -> Named.HomeAPI (AsServerT m)
api_home baseUrl = Named.HomeAPI $ catMaybes
api_home baseUrl = Named.HomeAPI $ runDBQuery $ catMaybes
<$> map (toPublicData baseUrl)
<$> filterPublicDatas
<$> selectPublic
api_node :: IsGargServer env err m => NodeId -> Named.FileAPI (AsServerT m)
api_node nId = Named.FileAPI $ do
pubNodes <- publicNodes
pubNodes <- runDBQuery publicNodes
-- TODO optimize with SQL
case Set.member nId pubNodes of
False -> serverError $ err405 { errBody = "Not allowed" }
......@@ -50,7 +50,7 @@ api_node nId = Named.FileAPI $ do
selectPublic :: HasNodeError err
=> DBCmd err [( Node HyperdataFolder, Maybe Int)]
=> DBQuery err x [( Node HyperdataFolder, Maybe Int)]
selectPublic = selectPublicNodes
-- For tests only
......@@ -68,7 +68,7 @@ filterPublicDatas datas =
& Map.elems
publicNodes :: HasNodeError err
=> DBCmdExtra err (Set NodeId)
=> DBQuery err x (Set NodeId)
publicNodes = do
candidates <- filterPublicDatas <$> selectPublicNodes
pure $ Set.fromList
......
......@@ -16,7 +16,7 @@ import Conduit
import Control.Exception.Safe qualified as Safe
import Control.Exception (toException)
import Control.Lens (view, (#), (^.))
import Control.Monad.Except (throwError, MonadError)
import Control.Monad.Except (throwError)
import Control.Monad (void, forM_)
import Data.Aeson qualified as JSON
import Data.Aeson.Types qualified as JS
......@@ -46,14 +46,14 @@ import Gargantext.Core.Config
import Gargantext.Core.Config.Types (f_write_url)
import Gargantext.Core (lookupDBid)
import Gargantext.Core.NLP (HasNLPServer)
import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv, HasNodeArchiveStoryImmediateSaver)
import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv (..))
import Gargantext.Core.Types.Main
import Gargantext.Core.Worker.Jobs (sendJob)
import Gargantext.Core.Worker.Jobs.Types qualified as Jobs
import Gargantext.Database.Admin.Types.Hyperdata.Default (DefaultHyperdata(..))
import Gargantext.Database.Admin.Types.Hyperdata.Frame (HyperdataFrame(..))
import Gargantext.Database.Admin.Types.Node hiding (ERROR, WARNING, INFO)
import Gargantext.Database.Prelude (IsDBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError, nodeError, NodeError (..))
import Gargantext.Database.Query.Table.Node (insertNodeWithHyperdata, getNodes, getUserRootPrivateNode)
import Gargantext.Database.Query.Table.Node qualified as DB
......@@ -95,7 +95,7 @@ renderExportableNode = \case
instance Serialise ExportableNode where
remoteExportAPI :: (MonadIO m, IsGargServer env BackendInternalError m, HasNodeArchiveStoryImmediateSaver env)
remoteExportAPI :: (MonadIO m, IsGargServer env BackendInternalError m)
=> NodeId
-> AuthenticatedUser
-> Named.RemoteExportAPI (AsServerT m)
......@@ -105,7 +105,7 @@ remoteExportAPI nodeId authenticatedUser =
withPolicy authenticatedUser (remoteExportChecks nodeId) (remoteExportHandler nodeId authenticatedUser payload) mgr
}
remoteImportAPI :: (MonadIO m, IsGargServer env BackendInternalError m, HasNodeArchiveStoryImmediateSaver env)
remoteImportAPI :: (MonadIO m, IsGargServer env BackendInternalError m)
=> AuthenticatedUser
-> Named.RemoteImportAPI (AsServerT m)
remoteImportAPI authenticatedUser =
......@@ -115,10 +115,9 @@ remoteImportAPI authenticatedUser =
type ExpectedPayload = Tree ExportableNode
remoteImportHandler :: forall err env m.
( HasNodeStoryEnv env
( HasNodeStoryEnv env err
, HasNodeError err
, HasBackendInternalError err
, HasNodeArchiveStoryImmediateSaver env
, IsDBCmd env err m
, HasNLPServer env
, MonadLogger m
......@@ -138,7 +137,7 @@ remoteImportHandler loggedInUser c = do
$(logLocM) INFO $ "Importing " <> renderExportableNode x
-- NOTE(adn) By default, we append the imported node(s) to the user's
-- private folder.
privateFolderId <- _node_id <$> getUserRootPrivateNode (_auth_user_id loggedInUser)
privateFolderId <- _node_id <$> runDBQuery (getUserRootPrivateNode (_auth_user_id loggedInUser))
$(logLocM) INFO $ "Attaching " <> renderExportableNode x <> " to private folder " <> T.pack (show privateFolderId)
-- Attempts to insert nodes as we go along.
rootNode <- insertNode (Just privateFolderId) x
......@@ -168,14 +167,14 @@ remoteImportHandler loggedInUser c = do
cfg <- view hasConfig
newHyperdataFrame <- importNote mgr noteAsMarkdown cfg
-- TODO(adn) Import with the valid name.
new_node <- DB.insertNode Notes (Just "Imported note")
(Just $ DefaultFrameCode newHyperdataFrame) parentId (_auth_user_id loggedInUser)
new_node <- runDBTx $ DB.insertNode Notes (Just "Imported note")
(Just $ DefaultFrameCode newHyperdataFrame) parentId (_auth_user_id loggedInUser)
pure new_node
EN_document x docsList -> case lookupDBid $ _node_typename x of
Nothing -> throwError $ _BackendInternalError # InternalUnexpectedError (toException $ userError $ "remoteImportHandler: impossible, node with invalid type.")
Just ty -> do
new_node <- insertNodeWithHyperdata ty (_node_name x) (_node_hyperdata x) mb_parent (_auth_user_id loggedInUser)
new_node <- runDBTx $ insertNodeWithHyperdata ty (_node_name x) (_node_hyperdata x) mb_parent (_auth_user_id loggedInUser)
$(logLocM) INFO $ "Created a new node " <> T.pack (show $ new_node) <> " of type " <> T.pack (show ty)
for_ mb_parent $ \parentId -> do
$(logLocM) INFO $ "Found document list to import..."
......@@ -192,7 +191,7 @@ remoteImportHandler loggedInUser c = do
EN_terms x ngramsList -> case lookupDBid $ _node_typename x of
Nothing -> throwError $ _BackendInternalError # InternalUnexpectedError (toException $ userError $ "remoteImportHandler: impossible, node with invalid type.")
Just ty -> do
new_node <- insertNodeWithHyperdata ty (_node_name x) (_node_hyperdata x) mb_parent (_auth_user_id loggedInUser)
new_node <- runDBTx $ insertNodeWithHyperdata ty (_node_name x) (_node_hyperdata x) mb_parent (_auth_user_id loggedInUser)
$(logLocM) INFO $ "Created a new node " <> T.pack (show $ new_node) <> " of type " <> T.pack (show ty)
$(logLocM) INFO $ "Found ngrams list to import..."
void $ sendJob $ Jobs.ImportRemoteTerms $ Jobs.ImportRemoteTermsPayload new_node ngramsList
......@@ -202,7 +201,7 @@ remoteImportHandler loggedInUser c = do
insertSimple mb_parent x = case lookupDBid $ _node_typename x of
Nothing -> throwError $ _BackendInternalError # InternalUnexpectedError (toException $ userError $ "remoteImportHandler: impossible, node with invalid type.")
Just ty -> do
new_node <- insertNodeWithHyperdata ty (_node_name x) (_node_hyperdata x) mb_parent (_auth_user_id loggedInUser)
new_node <- runDBTx $ insertNodeWithHyperdata ty (_node_name x) (_node_hyperdata x) mb_parent (_auth_user_id loggedInUser)
$(logLocM) INFO $ "Created a new node " <> T.pack (show $ new_node) <> " of type " <> T.pack (show ty)
pure new_node
......@@ -220,8 +219,10 @@ remoteExportHandler :: ( MonadIO m, Safe.MonadCatch m
-> m [NodeId]
remoteExportHandler _rer_node_id loggedInUser Named.RemoteExportRequest{..} = do
mgr <- view gargHttpManager
nodes <- getNodes _rer_node_id
checkNodesTypeAllowed nodes
nodes <- runDBQuery $ do
ns <- getNodes _rer_node_id
checkNodesTypeAllowed ns
pure ns
exportable <- makeExportable (_auth_node_id loggedInUser) nodes
liftIO (withClientM (remoteImportClient _rer_instance_auth (streamEncoder exportable)) (mkClientEnv mgr _rer_instance_url) streamDecode)
`Safe.catch` \(e :: BackendInternalError) -> throwError $ _BackendInternalError # e
......@@ -233,12 +234,13 @@ makeExportable :: (MonadIO m, IsGargServer err env m)
makeExportable userNodeId (TreeN x xs)
| Just nty <- lookupDBid (_node_typename x)
= do
env <- view hasNodeStory
exportableRoot <- case nty of
NodeCorpus -> EN_corpus <$> pure x
NodeGraph -> EN_graph <$> pure x
NodePhylo -> EN_phylo <$> pure x
NodeTexts -> EN_document <$> pure x <*> get_document_json userNodeId (_node_id x)
NodeList -> EN_terms <$> pure x <*> getNgramsList (_node_id x)
NodeList -> EN_terms <$> pure x <*> runDBQuery (getNgramsList env (_node_id x))
Notes -> case JS.parseMaybe JS.parseJSON (_node_hyperdata x) of
Nothing
-> mk_err " invalid HyperdataFrame inside."
......@@ -315,12 +317,12 @@ appendPath t r = case List.last t of
'/' -> t <> List.tail r
_ -> t <> r
checkNodesTypeAllowed :: (MonadError e m, HasNodeError e) => Tree (Node a) -> m ()
checkNodesTypeAllowed :: HasNodeError e => Tree (Node a) -> DBQuery e x ()
checkNodesTypeAllowed (TreeN r xs) = do
checkNodeTypeAllowed r
mapM_ checkNodesTypeAllowed xs
checkNodeTypeAllowed :: (MonadError e m, HasNodeError e) => Node a -> m ()
checkNodeTypeAllowed :: HasNodeError e => Node a -> DBQuery e x ()
checkNodeTypeAllowed n
| Just nty <- lookupDBid (_node_typename n)
, nty `elem` exportableNodeTypes
......
......@@ -3,15 +3,18 @@ module Gargantext.API.Server.Named.Viz (
graphAPI
) where
import Control.Lens (view)
import Gargantext.API.Admin.Auth (withNamedAccess)
import Gargantext.API.Admin.Auth.Types (AuthenticatedUser, PathId(..))
import Gargantext.API.Admin.EnvTypes (Env)
import Gargantext.API.Errors.Types ( BackendInternalError )
import Gargantext.API.Prelude (GargM)
import Gargantext.API.Routes.Named.Viz qualified as Named
import Gargantext.Core.Config (hasConfig)
import Gargantext.Core.NodeStory.Types (hasNodeStory)
import Gargantext.Core.Viz.Graph.API
-- (cooc2graph)
import Gargantext.Database.Admin.Types.Node (NodeId, UserId)
import Gargantext.Database.Prelude
import Gargantext.Prelude
import Servant.Server.Generic (AsServerT)
......@@ -20,15 +23,19 @@ graphAPI :: AuthenticatedUser -> UserId -> NodeId -> Named.GraphAPI (AsServerT (
graphAPI authenticatedUser userId n = withNamedAccess authenticatedUser (PathNode n) $ Named.GraphAPI
{ getGraphEp = getGraph n
, getGraphAsyncEp = graphAsync n
, cloneGraphEp = graphClone userId n
, cloneGraphEp = \grAPI -> do
cfg <- view hasConfig
runDBTx $ graphClone cfg userId n grAPI
, gexfEp = getGraphGexf n
, graphVersionsAPI = graphVersionsAPI userId n
, updateGraphLegendEp = updateGraphLegend n
, updateGraphLegendEp = runDBTx . updateGraphLegend n
}
graphVersionsAPI :: UserId -> NodeId -> Named.GraphVersionsAPI (AsServerT (GargM Env BackendInternalError))
graphVersionsAPI u n = Named.GraphVersionsAPI
{ getGraphVersionsEp = graphVersions u n
{ getGraphVersionsEp = do
env <- view hasNodeStory
runDBTx $ graphVersions env u n
, recomputeGraphVersionEp = recomputeVersions n
}
......@@ -41,7 +41,7 @@ import Gargantext.Core.Types.Query (Offset, Limit)
import Gargantext.Database.Action.Learn (FavOrTrash(..), moreLike)
import Gargantext.Database.Action.Search (searchCountInCorpus, searchInCorpus)
import Gargantext.Database.Admin.Types.Node (ContactId, CorpusId, NodeId)
import Gargantext.Database.Prelude (IsDBCmdExtra, IsDBCmd, DBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Facet (FacetDoc , runViewDocuments, runCountDocuments, OrderBy(..), runViewAuthorsDoc)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Prelude
......@@ -81,7 +81,7 @@ getTableApi cId tabType mLimit mOffset mOrderBy mQuery mYear =
where
get_table = do
$(logLocM) DEBUG $ "getTable cId = " <> T.pack (show cId)
t <- getTable cId tabType mOffset mLimit mOrderBy mQuery mYear
t <- runDBQuery $ getTable cId tabType mOffset mLimit mOrderBy mQuery mYear
pure $ constructHashedResponse t
postTableApi :: (IsDBCmdExtra env err m, MonadLogger m, HasNodeError err)
......@@ -91,7 +91,7 @@ postTableApi :: (IsDBCmdExtra env err m, MonadLogger m, HasNodeError err)
postTableApi cId tq = case tq of
TableQuery o l order ft "" -> do
$(logLocM) DEBUG $ "New search with no query"
getTable cId (Just ft) (Just o) (Just l) (Just order) Nothing Nothing
runDBQuery $ getTable cId (Just ft) (Just o) (Just l) (Just order) Nothing Nothing
TableQuery o l order ft q -> case ft of
Docs -> do
$(logLocM) DEBUG $ "New search with query " <> getRawQuery q
......@@ -121,7 +121,7 @@ searchInCorpus' cId t q o l order = do
Left noParseErr -> do
$(logLocM) ERROR $ "Invalid input query " <> (getRawQuery q) <> " , error = " <> (T.pack noParseErr)
pure $ TableResult 0 []
Right boolQuery -> do
Right boolQuery -> runDBQuery $ do
docs <- searchInCorpus cId t boolQuery o l order
countAllDocs <- searchCountInCorpus cId t (Just boolQuery)
pure $ TableResult { tr_docs = docs
......@@ -136,7 +136,7 @@ getTable :: HasNodeError err
-> Maybe OrderBy
-> Maybe RawQuery
-> Maybe Text
-> DBCmd err FacetTableResult
-> DBQuery err x FacetTableResult
getTable cId ft o l order raw_query year = do
docs <- getTable' cId ft o l order query year
docsCount <- runCountDocuments cId (ft == Just Trash) query year
......@@ -152,7 +152,7 @@ getTable' :: HasNodeError err
-> Maybe OrderBy
-> Maybe Text
-> Maybe Text
-> DBCmd err [FacetDoc]
-> DBQuery err x [FacetDoc]
getTable' cId ft o l order query year =
case ft of
(Just Docs) -> runViewDocuments cId False o l order query year
......@@ -162,10 +162,12 @@ getTable' cId ft o l order query year =
x -> panicTrace $ "not implemented in getTable: " <> (show x)
getPair :: ContactId -> Maybe TabType
-> Maybe Offset -> Maybe Limit
-> Maybe OrderBy -> DBCmd err [FacetDoc]
getPair cId ft o l order =
getPair :: IsDBCmd env err m
=> ContactId
-> Maybe TabType
-> Maybe Offset -> Maybe Limit
-> Maybe OrderBy -> m [FacetDoc]
getPair cId ft o l order = runDBQuery $ do
case ft of
(Just Docs) -> runViewAuthorsDoc cId False o l order
(Just Trash) -> runViewAuthorsDoc cId True o l order
......
......@@ -12,6 +12,8 @@ Portability : POSIX
{-# LANGUAGE Arrows #-}
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE TypeApplications #-}
{-# LANGUAGE ScopedTypeVariables #-}
module Gargantext.Core.NodeStory.DB
( nodeExists
......@@ -34,23 +36,22 @@ import Database.PostgreSQL.Simple.SqlQQ (sql)
import Database.PostgreSQL.Simple.Types (Values(..), QualifiedIdentifier(..))
import Gargantext.API.Ngrams.Types
import Gargantext.Core (toDBid)
import Gargantext.Core.NodeStory.Types ( a_state, a_version, ArchiveList, ArchiveStateList, NgramsStatePatch' )
import Gargantext.Core.NodeStory.Types ( a_state, a_version, ArchiveList, ArchiveStateList, NgramsStatePatch', ArchiveState )
import Gargantext.Core.Text.Ngrams (NgramsType)
import Gargantext.Database.Admin.Types.Node ( NodeId(..), NodeType )
import Gargantext.Database.Admin.Config ()
import Gargantext.Database.Schema.Ngrams ()
import Gargantext.Database.Prelude
import Gargantext.Prelude hiding (to)
import Gargantext.Prelude.Database ( runPGSExecute, runPGSExecuteMany, runPGSQuery, runPGSReturning )
nodeExists :: PGS.Connection -> NodeId -> IO Bool
nodeExists c nId = (== [PGS.Only True])
<$> runPGSQuery c [sql| SELECT true FROM nodes WHERE id = ? LIMIT 1 |]
(PGS.Only nId)
nodeExists :: NodeId -> DBQuery err x Bool
nodeExists nId = (== [PGS.Only True])
<$> mkPGQuery [sql| SELECT true FROM nodes WHERE id = ? LIMIT 1 |] (PGS.Only nId)
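-- A minimal sketch (hypothetical helper) of how these DBQuery values compose:
-- both queries below run on the same connection, inside the same transaction,
-- once the caller executes the block with runDBQuery.
nodesOfTypeIfExists :: NodeType -> NodeId -> DBQuery err x [NodeId]
nodesOfTypeIfExists nt nId = do
  exists <- nodeExists nId
  if exists then getNodesIdWithType nt else pure []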
getNodesIdWithType :: PGS.Connection -> NodeType -> IO [NodeId]
getNodesIdWithType c nt = do
ns <- runPGSQuery c query (PGS.Only $ toDBid nt)
getNodesIdWithType :: NodeType -> DBQuery err x [NodeId]
getNodesIdWithType nt = do
ns <- mkPGQuery query (PGS.Only $ toDBid nt)
pure $ map (\(PGS.Only nId) -> UnsafeMkNodeId nId) ns
where
query :: PGS.Query
......@@ -61,12 +62,11 @@ getNodesIdWithType c nt = do
-- which depends on the Ngrams List Flow
-- Version > 5 is hard-coded because, by default, the
-- first version in the history of manual changes is 6
getNodesArchiveHistory :: PGS.Connection
-> [NodeId]
-> IO [(NodeId, (Map NgramsType [HashMap NgramsTerm NgramsPatch]))]
getNodesArchiveHistory c nodesId = do
as <- runPGSQuery c query (PGS.Only $ Values fields nodesId)
:: IO [(Int, NgramsType, NgramsTerm, NgramsPatch)]
getNodesArchiveHistory :: [NodeId]
-> DBQuery err x [(NodeId, (Map NgramsType [HashMap NgramsTerm NgramsPatch]))]
getNodesArchiveHistory nodesId = do
as <- mkPGQuery query (PGS.Only $ Values fields nodesId)
:: DBQuery err x [(Int, NgramsType, NgramsTerm, NgramsPatch)]
pure $ map (\(nId, ngramsType, terms, patch)
-> ( UnsafeMkNodeId nId
......@@ -87,19 +87,18 @@ getNodesArchiveHistory c nodesId = do
|]
insertNodeArchiveHistory :: PGS.Connection -> NodeId -> Version -> [NgramsStatePatch'] -> IO ()
insertNodeArchiveHistory _ _ _ [] = pure ()
insertNodeArchiveHistory c nodeId version (h:hs) = do
insertNodeArchiveHistory :: NodeId -> Version -> [NgramsStatePatch'] -> DBUpdate err ()
insertNodeArchiveHistory _ _ [] = pure ()
insertNodeArchiveHistory nodeId version (h:hs) = do
let tuples = mconcat $ (\(nType, NgramsTablePatch patch) ->
(\(term, p) ->
(nodeId, nType, term, p)) <$> PM.toList patch) <$> PM.toList h :: [(NodeId, NgramsType, NgramsTerm, NgramsPatch)]
tuplesM <- mapM (\(nId, nType, term, patch) -> do
[PGS.Only ngramsId] <- runPGSReturning c qInsert [PGS.Only term] :: IO [PGS.Only Int]
pure (nId, nType, ngramsId, term, patch)
) tuples :: IO [(NodeId, NgramsType, Int, NgramsTerm, NgramsPatch)]
_ <- runPGSExecuteMany c query $ ((\(nId, nType, termId, _term, patch) -> (nId, nType, termId, patch, version)) <$> tuplesM)
_ <- insertNodeArchiveHistory c nodeId version hs
pure ()
forM_ tuples $ \(nId, nType, term, patch) -> do
(ngramsId :: Int) <- PGS.fromOnly <$> mkPGUpdateReturningOne qInsert (PGS.Only term)
mkPGUpdate query (nId, nType, ngramsId, patch, version)
void $ insertNodeArchiveHistory nodeId version hs
where
qInsert :: PGS.Query
qInsert = [sql|INSERT INTO ngrams (terms) VALUES (?)
......@@ -112,7 +111,7 @@ insertNodeArchiveHistory c nodeId version (h:hs) = do
VALUES (?, ?, ?, ?, ?)
|]
nodeStoriesQuery :: PGS.Query
nodeStoriesQuery = [sql| SELECT version, ngrams_type_id, terms, ngrams_repo_element
FROM node_stories
......@@ -124,34 +123,34 @@ nodeStoriesQuery = [sql| SELECT version, ngrams_type_id, terms, ngrams_repo_elem
-- Archive
insertArchiveStateList :: PGS.Connection -> NodeId -> Version -> ArchiveStateList -> IO ()
insertArchiveStateList c nodeId version as = do
mapM_ performInsert as
insertArchiveStateList :: NodeId -> Version -> ArchiveStateList -> DBUpdate err ()
insertArchiveStateList nodeId version = mapM_ performInsert
where
performInsert :: ArchiveState -> DBUpdate err ()
performInsert (ngramsType, ngrams, ngramsRepoElement) = do
[PGS.Only ngramsId] <- tryInsertTerms ngrams
ngramsId <- tryInsertTerms ngrams
_ <- case ngramsRepoElement ^. nre_root of
Nothing -> pure []
Just r -> tryInsertTerms r
Just r -> (:[]) <$> tryInsertTerms r
mapM_ tryInsertTerms $ ngramsRepoElement ^. nre_children
runPGSExecute c query (nodeId, ngramsId, version, ngramsType, ngramsRepoElement)
tryInsertTerms :: NgramsTerm -> IO [PGS.Only Int]
tryInsertTerms t = runPGSReturning c qInsert [PGS.Only t]
void $ mkPGUpdate query (nodeId, ngramsId, version, ngramsType, ngramsRepoElement)
tryInsertTerms :: NgramsTerm -> DBUpdate err Int
tryInsertTerms t = PGS.fromOnly <$> mkPGUpdateReturningOne qInsert (PGS.Only t)
qInsert :: PGS.Query
qInsert = [sql|INSERT INTO ngrams (terms) VALUES (?)
ON CONFLICT (terms) DO UPDATE SET terms = excluded.terms
RETURNING id|]
query :: PGS.Query
query = [sql|INSERT INTO node_stories(node_id, ngrams_id, version, ngrams_type_id, ngrams_repo_element)
VALUES (?, ?, ?, ?, ? :: jsonb)
|]
deleteArchiveStateList :: PGS.Connection -> NodeId -> ArchiveStateList -> IO ()
deleteArchiveStateList c nodeId as = do
mapM_ (\(nt, n, _) -> runPGSExecute c query (nodeId, nt, n)) as
deleteArchiveStateList :: NodeId -> ArchiveStateList -> DBUpdate err ()
deleteArchiveStateList nodeId as = do
mapM_ (\(nt, n, _) -> mkPGUpdate query (nodeId, nt, n)) as
where
query :: PGS.Query
query = [sql| DELETE FROM node_stories
......@@ -159,10 +158,10 @@ deleteArchiveStateList c nodeId as = do
AND ngrams_id IN (SELECT id FROM ngrams WHERE terms = ?)
|]
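-- A small sketch (hypothetical helper) of why the move from IO to DBUpdate
-- matters here: the delete and the insert defined above are now ordinary
-- DBUpdates, so a saver can chain them and have them commit, or roll back,
-- together.
resetArchiveStateList :: NodeId -> Version -> ArchiveStateList -> ArchiveStateList -> DBUpdate err ()
resetArchiveStateList nodeId version oldState newState = do
  deleteArchiveStateList nodeId oldState
  insertArchiveStateList nodeId version newState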
updateArchiveStateList :: PGS.Connection -> NodeId -> Version -> ArchiveStateList -> IO ()
updateArchiveStateList c nodeId version as = do
updateArchiveStateList :: NodeId -> Version -> ArchiveStateList -> DBUpdate err ()
updateArchiveStateList nodeId version as = do
let params = (\(nt, n, nre) -> (nre, version, nodeId, nt, n)) <$> as
mapM_ (runPGSExecute c query) params
mapM_ (mkPGUpdate query) params
where
query :: PGS.Query
query = [sql| UPDATE node_stories
......@@ -172,10 +171,10 @@ updateArchiveStateList c nodeId version as = do
|]
updateNodeStoryVersion :: PGS.Connection -> NodeId -> ArchiveList -> IO ()
updateNodeStoryVersion c nodeId newArchive = do
updateNodeStoryVersion :: NodeId -> ArchiveList -> DBUpdate err ()
updateNodeStoryVersion nodeId newArchive = do
let ngramsTypes = Map.keys $ newArchive ^. a_state
mapM_ (\nt -> runPGSExecute c query (newArchive ^. a_version, nodeId, nt)) ngramsTypes
mapM_ (\nt -> mkPGUpdate query (newArchive ^. a_version, nodeId, nt)) ngramsTypes
where
query :: PGS.Query
query = [sql|UPDATE node_stories
......
......@@ -17,10 +17,6 @@ module Gargantext.Core.NodeStory.Types
( HasNodeStory
, HasNodeStoryEnv
, hasNodeStory
, HasNodeStoryImmediateSaver
, hasNodeStoryImmediateSaver
, HasNodeArchiveStoryImmediateSaver
, hasNodeArchiveStoryImmediateSaver
, NodeStory(..)
, NgramsState'
, NgramsStatePatch'
......@@ -30,8 +26,10 @@ module Gargantext.Core.NodeStory.Types
, initNodeStory
, nse_getter
, nse_getter_multi
, nse_saver_immediate
, nse_archive_saver_immediate
, nse_saver
, nse_archive_saver
, hasNodeStoryImmediateSaver
, hasNodeArchiveStoryImmediateSaver
-- , nse_var
, unNodeStory
, Archive(..)
......@@ -42,12 +40,13 @@ module Gargantext.Core.NodeStory.Types
, a_state
, a_version
, combineState
, ArchiveState
, ArchiveStateSet
, ArchiveStateList )
where
import Codec.Serialise.Class ( Serialise )
import Control.Lens (Getter)
import Control.Lens (Getter, Lens')
import Data.Aeson hiding ((.=), decode)
import Data.Map.Strict qualified as Map
import Data.Profunctor.Product.TH (makeAdaptorAndInstance)
......@@ -58,7 +57,7 @@ import Gargantext.Database.Admin.Types.Node ( NodeId(..) )
import Gargantext.Core.Text.Ngrams qualified as Ngrams
import Gargantext.Core.Utils.Prefix (unPrefix)
import Gargantext.Database.Admin.Config ()
import Gargantext.Database.Prelude (IsDBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError())
import Gargantext.Prelude hiding (to)
import Opaleye (DefaultFromField(..), SqlJsonb, fromPGSFromField)
......@@ -187,37 +186,22 @@ type ArchiveList = Archive NgramsState' NgramsStatePatch'
------------------------------------------------------------------------
data NodeStoryEnv = NodeStoryEnv
{ _nse_saver_immediate :: !(NodeId -> ArchiveList -> IO ())
, _nse_archive_saver_immediate :: !(NodeId -> ArchiveList -> IO ArchiveList)
, _nse_getter :: !(NodeId -> IO ArchiveList)
, _nse_getter_multi :: !([NodeId] -> IO NodeListStory)
data NodeStoryEnv err = NodeStoryEnv
{ _nse_saver :: !(NodeId -> ArchiveList -> DBUpdate err ())
, _nse_archive_saver :: !(NodeId -> ArchiveList -> DBUpdate err ArchiveList)
, _nse_getter :: !(forall x. NodeId -> DBQuery err x ArchiveList)
, _nse_getter_multi :: !(forall x. [NodeId] -> DBQuery err x NodeListStory)
--, _nse_cleaner :: !(IO ()) -- every 12 hours: cleans the repos of unused NodeStories
-- , _nse_lock :: !FileLock -- TODO (it depends on the option: if with database or file only)
}
deriving (Generic)
type HasNodeStory env err m = ( IsDBCmd env err m
, MonadReader env m
, MonadError err m
, HasNodeStoryEnv env
, HasNodeError err
)
class (HasNodeStoryImmediateSaver env)
=> HasNodeStoryEnv env where
hasNodeStory :: Getter env NodeStoryEnv
class HasNodeStoryImmediateSaver env where
hasNodeStoryImmediateSaver :: Getter env (NodeId -> ArchiveList -> IO ())
type HasNodeStory env err m = ( IsDBCmd env err m, HasNodeStoryEnv env err, HasNodeError err)
class HasNodeArchiveStoryImmediateSaver env where
hasNodeArchiveStoryImmediateSaver :: Getter env (NodeId -> ArchiveList -> IO ArchiveList)
class HasNodeStoryEnv env err where
hasNodeStory :: Getter env (NodeStoryEnv err)
type ArchiveStateList = [(Ngrams.NgramsType, NgramsTerm, NgramsRepoElement)]
type ArchiveState = (Ngrams.NgramsType, NgramsTerm, NgramsRepoElement)
type ArchiveStateList = [ArchiveState]
type ArchiveStateSet = Set.Set (Ngrams.NgramsType, NgramsTerm)
------------------------------------------------------------------------
......@@ -226,3 +210,9 @@ type ArchiveStateSet = Set.Set (Ngrams.NgramsType, NgramsTerm)
makeLenses ''NodeStoryEnv
makeLenses ''NodeStory
makeLenses ''Archive
hasNodeStoryImmediateSaver :: Lens' (NodeStoryEnv err) (NodeId -> ArchiveList -> DBUpdate err ())
hasNodeStoryImmediateSaver = nse_saver
hasNodeArchiveStoryImmediateSaver :: Lens' (NodeStoryEnv err) (NodeId -> ArchiveList -> DBUpdate err ArchiveList)
hasNodeArchiveStoryImmediateSaver = nse_archive_saver
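-- Caller-side sketch (hypothetical helper, as it would live in a handler
-- module that also imports Control.Lens.view): the getter is now a DBQuery,
-- so it composes with any other query running in the same transaction.
getArchive :: HasNodeStory env err m => NodeId -> m ArchiveList
getArchive nId = do
  env <- view hasNodeStory
  runDBQuery $ _nse_getter env nId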
......@@ -6,7 +6,7 @@ import Data.Text qualified as T
import Gargantext.API.Dev (runCmdReplEasy)
import Gargantext.API.Errors.Types (BackendInternalError(InternalNodeError))
import Gargantext.Core (Lang(EN))
import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv)
import Gargantext.Core.NodeStory.Types (HasNodeStoryEnv, hasNodeStory)
import Gargantext.Core.NLP (HasNLPServer)
import Gargantext.Core.Text.Corpus.Query qualified as Q
import Gargantext.Core.Text.List.Social (FlowSocialListWith (..), FlowSocialListPriority (..))
......@@ -19,7 +19,7 @@ import Gargantext.Database.Action.Search (searchInCorpus)
import Gargantext.Database.Action.User (getUserId)
import Gargantext.Database.Admin.Types.Hyperdata.Corpus (HyperdataCorpus, hc_lang)
import Gargantext.Database.Admin.Types.Node (CorpusId, NodeId(UnsafeMkNodeId), NodeType(..), nodeId2ContextId)
import Gargantext.Database.Prelude (DBCmdWithEnv)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Facet.Types (facetDoc_id)
import Gargantext.Database.Query.Table.Node (insertDefaultNode, copyNodeStories, defaultList, getNodeWithType)
import Gargantext.Database.Query.Table.Node.Document.Add qualified as Document (add)
......@@ -48,7 +48,8 @@ subcorpusEasy username cId rawQuery reuseParentList = do
-- is inserted in the tree as a child of the parent corpus.
-- Creation of subcorpus "Docs" and "Terms" nodes is handled. The terms can be
-- either copied from the parent corpus or recomputed based on the subcorpus docs.
makeSubcorpusFromQuery :: ( HasNodeStoryEnv env
-- TODO(adn) Make (more) DB-transactional.
makeSubcorpusFromQuery :: ( HasNodeStoryEnv env BackendInternalError
, HasNLPServer env
)
=> User -- ^ The corpus owner
......@@ -57,32 +58,43 @@ makeSubcorpusFromQuery :: ( HasNodeStoryEnv env
-> Bool -- ^ Whether to reuse parent term list (True) or compute a new one based only on the documents in the subcorpus (False)
-> DBCmdWithEnv env BackendInternalError CorpusId -- ^ The child corpus ID
makeSubcorpusFromQuery user supercorpusId query reuseParentList = do
userId <- getUserId user
-- Insert the required nodes:
-- 1. The subcorpus root (under the original corpus root)
subcorpusId <- insertDefaultNode NodeCorpus supercorpusId userId
-- 2. The context (aka "Docs", aka "Terms") node (under the subcorpus root)
_ <- insertDefaultNode NodeTexts subcorpusId userId
-- 3. The terms (aka "List") node
subListId <- insertDefaultNode NodeList subcorpusId userId
-- Get the ID of the original terms node
superListId <- defaultList supercorpusId
-- Get ahold of all contexts that match the query, and add them to the subcorpus
-- (note that contexts are attached to a *corpus* node, not a *docs* node,
-- notwithstanding what you might think from th UI)
facetDocs <- searchInCorpus supercorpusId False query Nothing Nothing Nothing
_ <- Document.add subcorpusId $ nodeId2ContextId . facetDoc_id <$> facetDocs
env <- view hasNodeStory
(subcorpusId, subListId, superListId) <- runDBTx $ do
userId <- getUserId user
-- Insert the required nodes:
-- 1. The subcorpus root (under the original corpus root)
subcorpusId' <- insertDefaultNode NodeCorpus supercorpusId userId
-- 2. The context (aka "Docs", aka "Terms") node (under the subcorpus root)
_ <- insertDefaultNode NodeTexts subcorpusId' userId
-- 3. The terms (aka "List") node
subListId' <- insertDefaultNode NodeList subcorpusId' userId
-- Get the ID of the original terms node
superListId' <- defaultList supercorpusId
-- Get ahold of all contexts that match the query, and add them to the subcorpus
-- (note that contexts are attached to a *corpus* node, not a *docs* node,
-- notwithstanding what you might think from the UI)
facetDocs <- searchInCorpus supercorpusId False query Nothing Nothing Nothing
void $ Document.add subcorpusId' $ nodeId2ContextId . facetDoc_id <$> facetDocs
pure (subcorpusId', subListId', superListId')
if reuseParentList
-- Either simply copy parent terms...
then void $ copyNodeStories superListId subListId
then runDBTx $ void $ copyNodeStories superListId subListId
-- ... or rebuild a term list from scratch
-- TODO Check whether reusing the parent hyperdata is the right thing to do
else do
-- Get hyperdata from the original corpus
supercorpuses <- getNodeWithType supercorpusId NodeCorpus (Proxy :: Proxy HyperdataCorpus)
supercorpuses <- runDBQuery $ getNodeWithType supercorpusId NodeCorpus (Proxy :: Proxy HyperdataCorpus)
superHyperdata <- case supercorpuses of
[supercorpus] -> return $ view node_hyperdata supercorpus
_ -> throwError $ InternalNodeError NoCorpusFound
-- NOTE(adn) Unfortunately this function prevents us from running the whole
-- operation in a single DBTx, because deep down in its guts it relies on
-- the NLP server to extract the ngrams, something that could happen before
-- calling this.
buildSocialList
(fromMaybe EN $ view hc_lang superHyperdata)
user
......@@ -95,7 +107,8 @@ makeSubcorpusFromQuery user supercorpusId query reuseParentList = do
(Just (FlowSocialListWithPriority MySelfFirst) :: Maybe FlowSocialListWith)
-- In both cases we'll need to reindex our terms list so it matches the contexts
-- in the newly created subcorpus
reIndexWith subcorpusId subListId NgramsTerms (Set.singleton MapTerm)
_ <- updateContextScore subcorpusId subListId
_ <- updateNgramsOccurrences subcorpusId subListId
return subcorpusId
runDBTx $ do
reIndexWith env subcorpusId subListId NgramsTerms (Set.singleton MapTerm)
_ <- updateContextScore env subcorpusId subListId
_ <- updateNgramsOccurrences env subcorpusId subListId
pure subcorpusId
......@@ -25,8 +25,6 @@ import Data.Map.Strict qualified as Map
import Data.Set qualified as Set
import Data.Tuple.Extra (both)
import Gargantext.API.Ngrams.Types (NgramsElement, NgramsTerm(..))
import Gargantext.Core.NLP (HasNLPServer)
import Gargantext.Core.NodeStory.Types ( HasNodeStory )
import Gargantext.Core.Text (size)
import Gargantext.Core.Text.List.Group ( toGroupedTree, setScoresWithMap, toGroupedTreeInstitutes )
import Gargantext.Core.Text.List.Group.Prelude
......@@ -41,7 +39,7 @@ import Gargantext.Data.HashMap.Strict.Utils qualified as HashMap
import Gargantext.Database.Action.Metrics.NgramsByContext (getContextsByNgramsUser, getContextsByNgramsOnlyUser, getTreeInstitutesUser)
import Gargantext.Database.Action.Metrics.TFICF (getTficf_withSample)
import Gargantext.Database.Admin.Types.Node ( MasterCorpusId, UserCorpusId, ContextId )
import Gargantext.Database.Prelude (DBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.NgramsPostag (selectLems)
import Gargantext.Database.Query.Table.Node (defaultList)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError())
......@@ -61,13 +59,11 @@ isStopTerm (StopSize n) x = Text.length x < n || any isStopChar (Text.unpack x)
-- | Good value from users' requests and anthropological analysis
goodMapListSize :: Int
goodMapListSize = 350
-- | Consider using `buildSocialList` instead of this function.
-- TODO improve grouping functions of Authors, Sources, Institutes..
buildNgramsLists :: ( HasNodeStory env err m
, HasNLPServer env
, HasTreeError err
buildNgramsLists :: ( HasTreeError err
, HasNodeError err
)
=> User
......@@ -75,7 +71,7 @@ buildNgramsLists :: ( HasNodeStory env err m
-> MasterCorpusId
-> Maybe FlowSocialListWith
-> GroupParams
-> m (Map NgramsType [NgramsElement])
-> DBQuery err x (Map NgramsType [NgramsElement])
buildNgramsLists user uCid mCid mfslw gp = do
ngTerms <- buildNgramsTermsList user uCid mCid mfslw gp (NgramsTerms, MapListSize goodMapListSize)
instTerms <- buildNgramsInstList user uCid mfslw GroupIdentity (Institutes, MapListSize 300, MaxListSize 1000)
......@@ -92,16 +88,14 @@ newtype MaxListSize = MaxListSize { unMaxListSize :: Int }
buildNgramsInstList :: ( HasNodeError err
, HasNLPServer env
, HasNodeStory env err m
, HasTreeError err
)
, HasTreeError err
)
=> User
-> UserCorpusId
-> Maybe FlowSocialListWith
-> GroupParams
-> (NgramsType, MapListSize, MaxListSize)
-> m (Map NgramsType [NgramsElement])
-> DBQuery err x (Map NgramsType [NgramsElement])
buildNgramsInstList user uCid mfslw _groupParams (nt, MapListSize mapListSize, MaxListSize maxListSize) = do
allTerms :: HashMap NgramsTerm (Set ContextId) <- getContextsByNgramsUser uCid nt
......@@ -137,8 +131,6 @@ buildNgramsInstList user uCid mfslw _groupParams (nt, MapListSize mapListSize, M
)]
buildNgramsOthersList :: ( HasNodeError err
, HasNLPServer env
, HasNodeStory env err m
, HasTreeError err
)
=> User
......@@ -146,7 +138,7 @@ buildNgramsOthersList :: ( HasNodeError err
-> Maybe FlowSocialListWith
-> GroupParams
-> (NgramsType, MapListSize, MaxListSize)
-> m (Map NgramsType [NgramsElement])
-> DBQuery err x (Map NgramsType [NgramsElement])
buildNgramsOthersList user uCid mfslw _groupParams (nt, MapListSize mapListSize, MaxListSize maxListSize) = do
allTerms :: HashMap NgramsTerm (Set ContextId) <- getContextsByNgramsUser uCid nt
......@@ -171,7 +163,7 @@ buildNgramsOthersList user uCid mfslw _groupParams (nt, MapListSize mapListSize,
$ List.take maxListSize
$ List.sortOn (Down . viewScore . snd)
$ HashMap.toList tailTerms'
pure $ Map.fromList [( nt, List.take maxListSize $ toNgramsElement stopTerms
<> toNgramsElement mapTerms
<> toNgramsElement (setListType (Just MapTerm ) mapTerms')
......@@ -181,7 +173,7 @@ buildNgramsOthersList user uCid mfslw _groupParams (nt, MapListSize mapListSize,
-- | https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/169#note_10049
-- Stemming can be useful if you do not have any context: ok for full text search then.
--
--
-- In documents we have context, so we can add grammar and linguistic
-- rules to be more precise than stemming; that is why lemmatization
-- is used here for grouping. Basically it will avoid
......@@ -190,7 +182,7 @@ buildNgramsOthersList user uCid mfslw _groupParams (nt, MapListSize mapListSize,
getGroupParams :: ( HasNodeError err
, HasTreeError err
)
=> GroupParams -> HashSet Ngrams -> DBCmd err GroupParams
=> GroupParams -> HashSet Ngrams -> DBQuery err x GroupParams
getGroupParams gp@(GroupWithPosTag { .. }) ng = do
!hashMap <- HashMap.fromList <$> selectLems _gwl_lang _gwl_nlp_config (HashSet.toList ng)
-- printDebug "hashMap" hashMap
......@@ -200,8 +192,6 @@ getGroupParams gp _ = pure gp
-- TODO use ListIds
buildNgramsTermsList :: ( HasNodeError err
, HasNLPServer env
, HasNodeStory env err m
, HasTreeError err
)
=> User
......@@ -210,7 +200,7 @@ buildNgramsTermsList :: ( HasNodeError err
-> Maybe FlowSocialListWith
-> GroupParams
-> (NgramsType, MapListSize)
-> m (Map NgramsType [NgramsElement])
-> DBQuery err x (Map NgramsType [NgramsElement])
buildNgramsTermsList user uCid mCid mfslw groupParams (nt, MapListSize mapListSize) = do
-- Filter 0 With Double
......
......@@ -13,12 +13,10 @@ Portability : POSIX
module Gargantext.Core.Text.List.Social
where
import Control.Lens (view)
import Data.Aeson
import Data.HashMap.Strict (HashMap)
import Data.List qualified as List
import Data.Map.Strict qualified as Map
import Data.Pool ( withResource )
import Data.Swagger ( ToSchema(..), genericDeclareNamedSchema, defaultSchemaOptions )
import Data.Text qualified as T
import Data.Vector qualified as V
......@@ -30,7 +28,7 @@ import Gargantext.Core.Text.List.Social.Prelude (FlowCont, FlowListScores)
import Gargantext.Core.Text.Ngrams (NgramsType)
import Gargantext.Core.Types.Individu (User)
import Gargantext.Database.Admin.Types.Node (ListId)
import Gargantext.Database.Prelude (DBCmd, connPool)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Tree (NodeMode(Private), HasTreeError)
import Gargantext.Prelude
......@@ -120,7 +118,7 @@ flowSocialList :: ( HasNodeError err
-> User
-> NgramsType
-> FlowCont NgramsTerm FlowListScores
-> DBCmd err (FlowCont NgramsTerm FlowListScores)
-> DBQuery err x (FlowCont NgramsTerm FlowListScores)
flowSocialList Nothing u = flowSocialList' MySelfFirst u
flowSocialList (Just (FlowSocialListWithPriority p)) u = flowSocialList' p u
flowSocialList (Just (FlowSocialListWithLists ls)) _ = getHistoryScores ls
......@@ -132,7 +130,7 @@ flowSocialList' :: ( HasNodeError err
=> FlowSocialListPriority
-> User -> NgramsType
-> FlowCont NgramsTerm FlowListScores
-> DBCmd err (FlowCont NgramsTerm FlowListScores)
-> DBQuery err x (FlowCont NgramsTerm FlowListScores)
flowSocialList' flowPriority user nt flc =
mconcat <$> mapM (flowSocialListByMode' user nt flc)
(flowSocialListPriority flowPriority)
......@@ -144,7 +142,7 @@ flowSocialList' flowPriority user nt flc =
=> User -> NgramsType
-> FlowCont NgramsTerm FlowListScores
-> NodeMode
-> DBCmd err (FlowCont NgramsTerm FlowListScores)
-> DBQuery err x (FlowCont NgramsTerm FlowListScores)
flowSocialListByMode' user' nt' flc' mode =
findListsId user' mode
>>= flowSocialListByModeWith nt' flc'
......@@ -156,7 +154,7 @@ flowSocialList' flowPriority user nt flc =
=> NgramsType
-> FlowCont NgramsTerm FlowListScores
-> [ListId]
-> DBCmd err (FlowCont NgramsTerm FlowListScores)
-> DBQuery err x (FlowCont NgramsTerm FlowListScores)
flowSocialListByModeWith nt'' flc'' listes =
getHistoryScores listes nt'' flc''
......@@ -168,7 +166,7 @@ getHistoryScores :: ( HasNodeError err
=> [ListId]
-> NgramsType
-> FlowCont NgramsTerm FlowListScores
-> DBCmd err (FlowCont NgramsTerm FlowListScores)
-> DBQuery err x (FlowCont NgramsTerm FlowListScores)
getHistoryScores lists nt fl =
addScorePatches nt lists fl <$> getHistory [nt] lists
......@@ -178,10 +176,9 @@ getHistory :: ( HasNodeError err
)
=> [NgramsType]
-> [ListId]
-> DBCmd err (Map ListId (Map NgramsType [HashMap NgramsTerm NgramsPatch]))
-> DBQuery err x (Map ListId (Map NgramsType [HashMap NgramsTerm NgramsPatch]))
getHistory types listsId = do
pool <- view connPool
nsp <- liftBase $ withResource pool $ \c -> getNodesArchiveHistory c listsId
nsp <- getNodesArchiveHistory listsId
pure $ Map.map (Map.filterWithKey (\k _ -> List.elem k types))
$ Map.filterWithKey (\k _ -> List.elem k listsId)
$ Map.fromListWith (Map.unionWith (<>)) nsp
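A hedged usage sketch (hypothetical `fetchHistory` wrapper, not in this MR; constraints indicative): with `getHistory` now living in `DBQuery`, a caller no longer touches the connection pool or `withResource`, it simply runs the composed query from its concrete monad.

fetchHistory :: (HasNodeError err, IsDBCmd env err m)
             => [NgramsType]
             -> [ListId]
             -> m (Map ListId (Map NgramsType [HashMap NgramsTerm NgramsPatch]))
fetchHistory types listsId = runDBQuery (getHistory types listsId)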
......@@ -18,7 +18,7 @@ import Gargantext.Core (toDBid)
import Gargantext.Core.Types.Individu
import Gargantext.Database.Admin.Config ()
import Gargantext.Database.Admin.Types.Node
import Gargantext.Database.Prelude (DBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node.Error
import Gargantext.Database.Query.Tree
import Gargantext.Database.Query.Tree.Root (getRootId)
......@@ -28,7 +28,7 @@ import Gargantext.Database.Schema.Node
------------------------------------------------------------------------
findListsId :: (HasNodeError err, HasTreeError err)
=> User -> NodeMode -> DBCmd err [NodeId]
=> User -> NodeMode -> DBQuery err x [NodeId]
findListsId u mode = do
rootId <- getRootId u
userNode <- getNode rootId
......@@ -45,7 +45,7 @@ findNodes' :: (HasTreeError err, HasNodeError err)
=> UserId
-> RootId
-> NodeMode
-> DBCmd err [DbTreeNode]
-> DBQuery err x [DbTreeNode]
findNodes' loggedInUserId r Private = do
pv <- (findNodes loggedInUserId r Private $ [NodeFolderPrivate] <> commonNodes)
sh <- (findNodes' loggedInUserId r Shared)
......
......@@ -54,7 +54,7 @@ import Gargantext.Core.Text.Terms.Mono.Token.En (tokenize)
import Gargantext.Core.Text.Terms.Multi (multiterms)
import Gargantext.Core.Types ( TermsCount, TermsWeight, POS, Terms(..), TermsWithCount )
import Gargantext.Core.Utils (groupWithCounts)
import Gargantext.Database.Prelude (DBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Ngrams (insertNgrams)
import Gargantext.Database.Query.Table.NgramsPostag (NgramsPostag(..), insertNgramsPostag, np_form, np_lem)
import Gargantext.Database.Schema.Ngrams (text2ngrams, NgramsId)
......@@ -151,7 +151,7 @@ extracted2ngrams (SimpleNgrams ng) = ng
extracted2ngrams (EnrichedNgrams ng) = view np_form ng
---------------------------
insertExtractedNgrams :: [ ExtractedNgrams ] -> DBCmd err (HashMap Text NgramsId)
insertExtractedNgrams :: [ ExtractedNgrams ] -> DBUpdate err (HashMap Text NgramsId)
insertExtractedNgrams ngs = do
let (s, e) = List.partition isSimpleNgrams ngs
m1 <- insertNgrams (map unSimpleNgrams s)
......
......@@ -20,7 +20,7 @@ import Data.Vector qualified as V
import Gargantext.API.Ngrams.NgramsTree ( toTree, NgramsTree )
import Gargantext.API.Ngrams.Tools ( filterListWithRoot, getListNgrams, getRepo, mapTermListRoot )
import Gargantext.API.Ngrams.Types ( NgramsTerm(NgramsTerm) )
import Gargantext.Core.NodeStory.Types ( HasNodeStory )
import Gargantext.Core.NodeStory.Types ( NodeStoryEnv )
import Gargantext.Core.Text.Metrics.Count (occurrencesWith)
import Gargantext.Core.Text.Ngrams (NgramsType)
import Gargantext.Core.Types.Main ( ListType )
......@@ -28,7 +28,7 @@ import Gargantext.Database.Admin.Types.Node ( NodeType(NodeList), CorpusId, cont
import Gargantext.Core.Viz.Types ( Histo(Histo) )
import Gargantext.Database.Action.Metrics.NgramsByContext ( countContextsByNgramsWith, getContextsByNgramsOnlyUser )
import Gargantext.Database.Admin.Config ( userMaster )
import Gargantext.Database.Prelude (DBCmd)
import Gargantext.Database.Prelude (DBQuery)
import Gargantext.Database.Query.Table.Node ( getListsWithParentId )
import Gargantext.Database.Query.Table.Node.Select ( selectNodesWithUsername )
import Gargantext.Database.Query.Table.NodeContext (selectDocsDates)
......@@ -36,7 +36,7 @@ import Gargantext.Database.Schema.Node ( NodePoly(_node_id) )
import Gargantext.Prelude hiding (toList)
histoData :: CorpusId -> DBCmd err Histo
histoData :: CorpusId -> DBQuery err x Histo
histoData cId = do
dates <- selectDocsDates cId
let (ls, css) = V.unzip
......@@ -47,13 +47,15 @@ histoData cId = do
pure (Histo ls css)
chartData :: HasNodeStory env err m
=> CorpusId -> NgramsType -> ListType
-> m Histo
chartData cId nt lt = do
chartData :: NodeStoryEnv err
-> CorpusId
-> NgramsType
-> ListType
-> DBQuery err x Histo
chartData env cId nt lt = do
ls' <- selectNodesWithUsername NodeList userMaster
ls <- map (_node_id) <$> getListsWithParentId cId
ts <- mapTermListRoot ls nt <$> getRepo ls
ts <- mapTermListRoot ls nt <$> getRepo env ls
let
dico = filterListWithRoot [lt] ts
terms = catMaybes $ List.concat $ map (\(a,b) -> [Just a, b]) $ HashMap.toList dico
......@@ -71,13 +73,15 @@ chartData cId nt lt = do
pure (Histo dates (round <$> count))
treeData :: HasNodeStory env err m
=> CorpusId -> NgramsType -> ListType
-> m (V.Vector NgramsTree)
treeData cId nt lt = do
treeData :: NodeStoryEnv err
-> CorpusId
-> NgramsType
-> ListType
-> DBQuery err x (V.Vector NgramsTree)
treeData env cId nt lt = do
ls' <- selectNodesWithUsername NodeList userMaster
ls <- map (_node_id) <$> getListsWithParentId cId
ts <- mapTermListRoot ls nt <$> getRepo ls
ts <- mapTermListRoot ls nt <$> getRepo env ls
let
dico = filterListWithRoot [lt] ts
......@@ -86,5 +90,5 @@ treeData cId nt lt = do
-- FIXME(adn) Audit the usage, as we are converting between a context id to a node id.
cs' <- HashMap.map (Set.map contextId2NodeId) <$> getContextsByNgramsOnlyUser cId (ls' <> ls) nt terms
m <- getListNgrams ls nt
m <- getListNgrams env ls nt
pure $ V.fromList $ toTree lt cs' m
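For handlers still running in a concrete monad, the intended call pattern for the new signatures is roughly the following (a sketch only; `getChart` is a hypothetical name, the constraint set mirrors the one used by other handlers in this MR, and the usual lens imports are assumed):

getChart :: (HasNodeStoryEnv env err, HasNodeError err, IsDBCmd env err m)
         => CorpusId -> NgramsType -> ListType -> m Histo
getChart cId nt lt = do
  env <- view hasNodeStory            -- pure NodeStoryEnv from the reader env
  runDBQuery (chartData env cId nt lt)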
......@@ -17,7 +17,7 @@ Portability : POSIX
module Gargantext.Core.Viz.Graph.API
where
import Control.Lens (set, _Just, (^?), at)
import Control.Lens (set, _Just, (^?), at, view)
import Data.HashMap.Strict qualified as HashMap
import Gargantext.API.Admin.EnvTypes (Env)
import Gargantext.API.Errors.Types ( BackendInternalError )
......@@ -26,7 +26,7 @@ import Gargantext.API.Prelude (GargM)
import Gargantext.API.Routes.Named.Viz qualified as Named
import Gargantext.API.Worker (serveWorkerAPI)
import Gargantext.Core.Methods.Similarities (Similarity(..), GraphMetric(..), withMetric)
import Gargantext.Core.NodeStory.Types ( HasNodeStory, a_version, unNodeStory, NodeListStory )
import Gargantext.Core.NodeStory.Types ( HasNodeStory, a_version, unNodeStory, NodeListStory, NodeStoryEnv, hasNodeStory, HasNodeStoryEnv )
import Gargantext.Core.Text.Ngrams (NgramsType(..))
import Gargantext.Core.Types.Main ( ListType(MapTerm) )
import Gargantext.Core.Viz.Graph.GEXF ()
......@@ -37,7 +37,7 @@ import Gargantext.Database.Action.Metrics.NgramsByContext (getContextsByNgramsOn
import Gargantext.Database.Action.Node (mkNodeWithParent)
import Gargantext.Database.Admin.Config ( userMaster )
import Gargantext.Database.Admin.Types.Node
import Gargantext.Database.Prelude (DBCmd, DBCmdWithEnv)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node ( getOrMkList, getNodeWith, defaultList, getClosestParentIdByType )
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.Node.Select ( selectNodesWithUsername )
......@@ -47,25 +47,28 @@ import Gargantext.Prelude
import Gargantext.Utils.Jobs.Monad (MonadJobStatus(..))
import Servant
import Servant.Server.Generic (AsServerT)
import Gargantext.Core.Config (GargConfig)
------------------------------------------------------------------------
--getGraph :: UserId -> NodeId -> GargServer HyperdataGraphAPI
getGraph :: HasNodeStory env err m
-- TODO(adn) DB-transactional
getGraph :: (HasNodeStoryEnv env err, HasNodeError err, IsDBCmd env err m)
=> NodeId
-> m HyperdataGraphAPI
getGraph nId = do
nodeGraph <- getNodeWith nId (Proxy :: Proxy HyperdataGraph)
env <- view hasNodeStory
nodeGraph <- runDBQuery $ getNodeWith nId (Proxy :: Proxy HyperdataGraph)
let
graph = nodeGraph ^. node_hyperdata . hyperdataGraph
camera = nodeGraph ^. node_hyperdata . hyperdataCamera
mcId <- getClosestParentIdByType nId NodeCorpus
mcId <- runDBQuery $ getClosestParentIdByType nId NodeCorpus
let cId = maybe (panic "[G.V.G.API] Node has no parent") identity mcId
-- printDebug "[getGraph] getting list for cId" cId
listId <- defaultList cId
repo <- getRepo [listId]
listId <- runDBQuery $ defaultList cId
repo <- runDBQuery $ getRepo env [listId]
-- TODO Similarity in Graph params
case graph of
......@@ -80,7 +83,7 @@ getGraph nId = do
graph'' = set graph_metadata (Just mt') graph'
hg = HyperdataGraphAPI graph'' camera
-- _ <- updateHyperdata nId hg
_ <- updateHyperdata nId (HyperdataGraph (Just graph'') camera)
_ <- runDBTx $ updateHyperdata nId (HyperdataGraph (Just graph'') camera)
pure $ trace ("[G.V.G.API] Graph empty, computing" :: Text) hg
Just graph' -> pure $ trace ("[G.V.G.API] Graph exists, returning" :: Text) $
......@@ -88,7 +91,8 @@ getGraph nId = do
--recomputeGraph :: UserId -> NodeId -> Maybe GraphMetric -> GargNoServer Graph
recomputeGraph :: HasNodeStory env err m
-- TODO(adn) make db-transactional.
recomputeGraph :: (HasNodeStoryEnv env err, IsDBCmd env err m, HasNodeError err)
=> NodeId
-> BridgenessMethod
-> Maybe GraphMetric
......@@ -98,7 +102,8 @@ recomputeGraph :: HasNodeStory env err m
-> Bool
-> m Graph
recomputeGraph nId bridgeMethod maybeSimilarity maybeStrength nt1 nt2 force' = do
nodeGraph <- getNodeWith nId (Proxy :: Proxy HyperdataGraph)
env <- view hasNodeStory
nodeGraph <- runDBQuery $ getNodeWith nId (Proxy :: Proxy HyperdataGraph)
let
graph = nodeGraph ^. node_hyperdata . hyperdataGraph
camera = nodeGraph ^. node_hyperdata . hyperdataCamera
......@@ -117,18 +122,18 @@ recomputeGraph nId bridgeMethod maybeSimilarity maybeStrength nt1 nt2 force' = d
Just mr -> fromMaybe Strong mr
Just r -> r
mcId <- getClosestParentIdByType nId NodeCorpus
mcId <- runDBQuery $ getClosestParentIdByType nId NodeCorpus
let cId = maybe (panic "[G.V.G.API] Node has no parent") identity mcId
listId <- defaultList cId
repo <- getRepo [listId]
listId <- runDBQuery $ defaultList cId
repo <- runDBQuery $ getRepo env [listId]
let v = repo ^. unNodeStory . at listId . _Just . a_version
let computeG mt = do
!g <- computeGraph cId bridgeMethod similarity strength (nt1,nt2) repo
let mt' = set gm_legend (generateLegend g) mt
let g' = set graph_metadata (Just mt') g
_nentries <- updateHyperdata nId (HyperdataGraph (Just g') camera)
_nentries <- runDBTx $ updateHyperdata nId (HyperdataGraph (Just g') camera)
pure g'
case graph of
......@@ -150,6 +155,7 @@ recomputeGraph nId bridgeMethod maybeSimilarity maybeStrength nt1 nt2 force' = d
-- TODO remove repo
-- TODO(adn) DB-transactional
computeGraph :: HasNodeError err
=> CorpusId
-> BridgenessMethod
......@@ -160,8 +166,7 @@ computeGraph :: HasNodeError err
-> DBCmd err Graph
computeGraph corpusId bridgeMethod similarity strength (nt1,nt2) repo = do
-- Getting the Node parameters
lId <- defaultList corpusId
lIds <- selectNodesWithUsername NodeList userMaster
(lId, lIds) <- runDBQuery $ ((,) <$> defaultList corpusId <*> selectNodesWithUsername NodeList userMaster)
-- Getting the Ngrams to compute with and grouping it according to the lists
let
......@@ -173,12 +178,12 @@ computeGraph corpusId bridgeMethod similarity strength (nt1,nt2) repo = do
-- Optim if nt1 == nt2 : do not compute twice
(m1,m2) <- do
m1 <- groupedContextsByNgrams nt1 corpusId (lIds, [lId])
m1 <- runDBQuery $ groupedContextsByNgrams nt1 corpusId (lIds, [lId])
if nt1 == nt2
then
pure (m1,m1)
else do
m2 <- groupedContextsByNgrams nt2 corpusId (lIds, [lId])
m2 <- runDBQuery $ groupedContextsByNgrams nt2 corpusId (lIds, [lId])
pure (m1,m2)
-- Removing the hapax (ngrams with 1 cooc)
......@@ -239,11 +244,12 @@ graphRecompute n jobHandle = do
_g <- recomputeGraph n BridgenessBasic Nothing Nothing NgramsTerms NgramsTerms False
markComplete jobHandle
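The read/compute/write split used by `getGraph` and `recomputeGraph` can be summarised by the following sketch (hypothetical `refreshHyperdata` name, indicative constraints): reads go through `runDBQuery`, the non-DB graph computation stays in the concrete monad, and the result is persisted in a separate `runDBTx`.

refreshHyperdata :: (HasNodeError err, IsDBCmd env err m)
                 => NodeId -> m ()
refreshHyperdata nId = do
  nodeGraph <- runDBQuery $ getNodeWith nId (Proxy :: Proxy HyperdataGraph)
  let graph  = nodeGraph ^. node_hyperdata . hyperdataGraph
      camera = nodeGraph ^. node_hyperdata . hyperdataCamera
  -- (any non-DB computation over 'graph' would happen here)
  void $ runDBTx $ updateHyperdata nId (HyperdataGraph graph camera)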
graphVersions :: (HasNodeStory env err m)
=> UserId
graphVersions :: HasNodeError err
=> NodeStoryEnv err
-> UserId
-> NodeId
-> m GraphVersions
graphVersions u nId = do
-> DBUpdate err GraphVersions
graphVersions env u nId = do
nodeGraph <- getNodeWith nId (Proxy :: Proxy HyperdataGraph)
let
graph = nodeGraph
......@@ -261,7 +267,7 @@ graphVersions u nId = do
let cId = maybe (panic "[G.V.G.API] Node has no parent") identity mcId
listId <- getOrMkList cId u
repo <- getRepo [listId]
repo <- getRepo env [listId]
let v = repo ^. unNodeStory . at listId . _Just . a_version
-- printDebug "graphVersions" v
......@@ -274,16 +280,17 @@ recomputeVersions :: HasNodeStory env err m
recomputeVersions nId = recomputeGraph nId BridgenessBasic Nothing Nothing NgramsTerms NgramsTerms False
------------------------------------------------------------
graphClone :: (HasNodeError err)
=> UserId
graphClone :: HasNodeError err
=> GargConfig
-> UserId
-> NodeId
-> HyperdataGraphAPI
-> DBCmdWithEnv env err NodeId
graphClone userId pId (HyperdataGraphAPI { _hyperdataAPIGraph = graph
-> DBUpdate err NodeId
graphClone cfg userId pId (HyperdataGraphAPI { _hyperdataAPIGraph = graph
, _hyperdataAPICamera = camera }) = do
let nodeType = NodeGraph
nodeParent <- getNodeWith pId (Proxy :: Proxy HyperdataGraph)
nIds <- mkNodeWithParent nodeType (Just pId) userId $ nodeParent ^. node_name
nIds <- mkNodeWithParent cfg nodeType (Just pId) userId $ nodeParent ^. node_name
case nIds of
[] -> pure pId
(nId:_) -> do
......@@ -309,7 +316,7 @@ getGraphGexf nId = do
updateGraphLegend :: HasNodeError err
=> NodeId
-> GraphLegendAPI
-> DBCmd err NodeId
-> DBUpdate err NodeId
updateGraphLegend nId (GraphLegendAPI lg ) = do
nodeGraph <- getNodeWith nId (Proxy :: Proxy HyperdataGraph)
let graph = nodeGraph ^. node_hyperdata . hyperdataGraph
......
......@@ -28,6 +28,7 @@ import Gargantext.Core.Types.Phylo (GraphData(..))
import Gargantext.Core.Viz.LegacyPhylo hiding (Phylo(..))
import Gargantext.Core.Viz.Phylo (PhyloConfig(..), defaultConfig, _phylo_param, _phyloParam_config)
import Gargantext.Core.Viz.Phylo.API.Tools
import Gargantext.Database.Prelude
import Gargantext.Database.Admin.Types.Hyperdata
import Gargantext.Database.Admin.Types.Node -- (PhyloId, ListId, CorpusId, UserId, NodeId(..))
import Gargantext.Database.Query.Table.Node (getClosestParentIdByType, defaultList)
......@@ -50,12 +51,13 @@ phyloAPI n = Named.PhyloAPI
-- Add real text processing
-- Fix Filter parameters
-- TODO fix parameters to default config that should be in Node
-- NOTE(adn) this is not DB-tx safe with regard to reads.
getPhylo :: IsGargServer err env m => PhyloId -> Named.GetPhylo (AsServerT m)
getPhylo phyloId = Named.GetPhylo $ \lId _level _minSizeBranch -> do
corpusId <- maybe (nodeLookupError $ NodeParentDoesNotExist phyloId) pure
=<< getClosestParentIdByType phyloId NodeCorpus
=<< (runDBQuery $ getClosestParentIdByType phyloId NodeCorpus)
listId <- case lId of
Nothing -> defaultList corpusId
Nothing -> runDBQuery $ defaultList corpusId
Just ld -> pure ld
pd <- getPhyloDataJson phyloId
-- printDebug "getPhylo" theData
......@@ -68,7 +70,7 @@ getPhylo phyloId = Named.GetPhylo $ \lId _level _minSizeBranch -> do
getPhyloDataJson :: PhyloId -> GargNoServer (Maybe (GraphData, PhyloConfig))
getPhyloDataJson phyloId = do
phyloData <- getPhyloData phyloId
phyloData <- runDBQuery $ getPhyloData phyloId
phyloJson <- liftBase $ maybePhylo2dot2json phyloData
case phyloJson of
Nothing -> pure Nothing
......@@ -92,6 +94,8 @@ getPhyloDataJson phyloId = do
-- pure (SVG p)
-- FIXME(adn) This handler mixes DB reads with updates outside of the same
-- transaction, due to the call to 'flowPhyloAPI' in the middle.
postPhylo :: IsGargServer err env m => PhyloId -> Named.PostPhylo (AsServerT m)
postPhylo phyloId = Named.PostPhylo $ \_lId -> do
-- TODO get Reader settings
......@@ -100,12 +104,12 @@ postPhylo phyloId = Named.PostPhylo $ \_lId -> do
-- _vrs = Just ("1" :: Text)
-- _sft = Just (Software "Gargantext" "4")
-- _prm = initPhyloParam vrs sft (Just q)
corpusId <- getClosestParentIdByType phyloId NodeCorpus
corpusId <- runDBQuery $ getClosestParentIdByType phyloId NodeCorpus
-- Being the first time we ask for the Phylo, there is no historical data
-- available about computing time, so we pass 'Nothing'.
phy <- flowPhyloAPI defaultConfig Nothing (fromMaybe (panicTrace "[G.C.V.P.API] no corpus ID found") corpusId) -- params
-- phyloId <- insertNodes [node NodePhylo "Phylo" (HyperdataPhylo Nothing (Just phy)) (Just corpusId) userId]
_ <- updateHyperdata phyloId (HyperdataPhylo Nothing (Just phy))
_ <- runDBTx $ updateHyperdata phyloId (HyperdataPhylo Nothing (Just phy))
pure phyloId
------------------------------------------------------------------------
......
......@@ -29,7 +29,7 @@ import Data.Time.Clock.POSIX(posixSecondsToUTCTime)
import Gargantext.API.Ngrams.Prelude (getTermList)
import Gargantext.API.Ngrams.Types (NgramsTerm(..))
import Gargantext.Core (withDefaultLanguage, Lang)
import Gargantext.Core.NodeStory.Types (HasNodeStory)
import Gargantext.Core.NodeStory.Types (HasNodeStory, NodeStoryEnv, HasNodeStoryEnv (..))
import Gargantext.Core.Text.Ngrams (NgramsType(..))
import Gargantext.Core.Text.Terms.WithList (Patterns, buildPatterns, termsInText)
import Gargantext.Core.Types.Main (ListType(MapTerm))
......@@ -41,7 +41,7 @@ import Gargantext.Database.Admin.Types.Hyperdata.Corpus ( HyperdataCorpus(..) )
import Gargantext.Database.Admin.Types.Hyperdata.Document (HyperdataDocument(..))
import Gargantext.Database.Admin.Types.Hyperdata.Phylo ( HyperdataPhylo(..) )
import Gargantext.Database.Admin.Types.Node (Context, CorpusId, ContextId, PhyloId, nodeId2ContextId)
import Gargantext.Database.Prelude (DBCmd)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (defaultList, getNodeWith)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.NodeContext (selectDocNodes)
......@@ -57,7 +57,7 @@ import System.Process qualified as Shell
--------------------------------------------------------------------
getPhyloData :: HasNodeError err
=> PhyloId -> DBCmd err (Maybe Phylo)
=> PhyloId -> DBQuery err x (Maybe Phylo)
getPhyloData phyloId = do
nodePhylo <- getNodeWith phyloId (Proxy :: Proxy HyperdataPhylo)
pure $ _hp_data $ _node_hyperdata nodePhylo
......@@ -117,7 +117,8 @@ flowPhyloAPI :: (HasNodeStory env err m, HasNodeError err, MonadLogger m)
-> CorpusId
-> m Phylo
flowPhyloAPI config mbOldComputeHistory cId = do
corpus <- timeMeasured "flowPhyloAPI.corpusIdtoDocuments" $ corpusIdtoDocuments (timeUnit config) cId
env <- view hasNodeStory
corpus <- timeMeasured "flowPhyloAPI.corpusIdtoDocuments" $ runDBQuery $ corpusIdtoDocuments env (timeUnit config) cId
-- writePhylo phyloWithCliquesFile phyloWithCliques
$(logLocM) DEBUG $ "PhyloConfig old: " <> show config
......@@ -130,12 +131,15 @@ flowPhyloAPI config mbOldComputeHistory cId = do
pure $! trackComputeTime (t1 + t2 + t3) (finalPhylo { _phylo_computeTime = mbOldComputeHistory })
--------------------------------------------------------------------
corpusIdtoDocuments :: (HasNodeStory env err m, HasNodeError err)
=> TimeUnit -> CorpusId -> m [Document]
corpusIdtoDocuments timeUnit corpusId = do
corpusIdtoDocuments :: HasNodeError err
=> NodeStoryEnv err
-> TimeUnit
-> CorpusId
-> DBQuery err x [Document]
corpusIdtoDocuments env timeUnit corpusId = do
docs <- selectDocNodes corpusId
lId <- defaultList corpusId
termList <- getTermList lId MapTerm NgramsTerms
termList <- getTermList env lId MapTerm NgramsTerms
corpus_node <- getNodeWith corpusId (Proxy @HyperdataCorpus)
let corpusLang = view (node_hyperdata . to _hc_lang) corpus_node
......
......@@ -25,7 +25,7 @@ import Data.Text qualified as Text
import Gargantext.API.Ngrams.Tools (getTermsWith)
import Gargantext.API.Ngrams.Types (NgramsTerm(..))
import Gargantext.Core (HasDBid, withDefaultLanguage)
import Gargantext.Core.NodeStory.Types (HasNodeStory)
import Gargantext.Core.NodeStory.Types (HasNodeStory, HasNodeStoryEnv (..))
import Gargantext.Core.Text.Context (TermList)
import Gargantext.Core.Text.Ngrams (NgramsType(..))
import Gargantext.Core.Text.Terms.WithList ( buildPatterns, termsInText, Patterns )
......@@ -37,36 +37,38 @@ import Gargantext.Database.Admin.Types.Hyperdata.Document ( HyperdataDocument(_h
import Gargantext.Database.Query.Table.Node(defaultList, getNodeWith)
import Gargantext.Database.Query.Table.NodeContext (selectDocs)
import Gargantext.Database.Schema.Node ( node_hyperdata )
import Gargantext.Database.Prelude
import Gargantext.Prelude hiding (to)
type MinSizeBranch = Int
flowPhylo :: (HasNodeStory env err m, HasDBid NodeType)
flowPhylo :: (HasNodeStory env err m, HasDBid NodeType, IsDBCmd env err m)
=> CorpusId
-> m Phylo
flowPhylo cId = do
corpus_node <- getNodeWith cId (Proxy @HyperdataCorpus)
let lang = withDefaultLanguage $ view (node_hyperdata . to _hc_lang) corpus_node
list' <- defaultList cId
termList <- HashMap.toList <$> getTermsWith (Text.words . unNgramsTerm) [list'] NgramsTerms (Set.singleton MapTerm)
docs' <- catMaybes
<$> map (\h -> (,) <$> _hd_publication_year h
<*> _hd_abstract h
)
<$> selectDocs cId
let
patterns = buildPatterns termList
-- | To filter the Ngrams of a document based on the termList
filterTerms :: Patterns -> (Date, Text) -> (Date, [Text])
filterTerms patterns' (y,d) = (y, fst <$> termsInText lang patterns' d)
docs = map ((\(y,t) -> Document y t) . filterTerms patterns) docs'
--liftBase $ flowPhylo' (List.sortOn date docs) termList l m fp
pure $ buildPhylo (List.sortOn date docs) termList
env <- view hasNodeStory
runDBQuery $ do
corpus_node <- getNodeWith cId (Proxy @HyperdataCorpus)
let lang = withDefaultLanguage $ view (node_hyperdata . to _hc_lang) corpus_node
list' <- defaultList cId
termList <- HashMap.toList <$> getTermsWith env (Text.words . unNgramsTerm) [list'] NgramsTerms (Set.singleton MapTerm)
docs' <- catMaybes
<$> map (\h -> (,) <$> _hd_publication_year h
<*> _hd_abstract h
)
<$> selectDocs cId
let
patterns = buildPatterns termList
-- | To filter the Ngrams of a document based on the termList
filterTerms :: Patterns -> (Date, Text) -> (Date, [Text])
filterTerms patterns' (y,d) = (y, fst <$> termsInText lang patterns' d)
docs = map ((\(y,t) -> Document y t) . filterTerms patterns) docs'
--liftBase $ flowPhylo' (List.sortOn date docs) termList l m fp
pure $ buildPhylo (List.sortOn date docs) termList
-- TODO SortedList Document
......
......@@ -53,7 +53,7 @@ import Gargantext.Core.Worker.Env
import Gargantext.Core.Worker.Jobs.Types (Job(..), getWorkerMNodeId, ImportRemoteDocumentsPayload(..), ImportRemoteTermsPayload(..))
import Gargantext.Core.Worker.PGMQTypes (BrokerMessage, HasWorkerBroker, WState)
import Gargantext.Core.Worker.Types (JobInfo(..))
import Gargantext.Database.Prelude (readLargeObject, removeLargeObject)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.User (getUsersWithEmail)
import Gargantext.Prelude hiding (to)
import Gargantext.System.Logging
......@@ -261,7 +261,7 @@ performAction env _state bm = do
-- | Forgot password task
ForgotPasswordAsync { _fpa_args = ForgotPasswordAsyncParams { email } } -> runWorkerMonad env $ do
$(logLocM) DEBUG $ "[performAction] forgot password: " <> email
us <- getUsersWithEmail (T.toLower email)
us <- runDBQuery $ getUsersWithEmail (T.toLower email)
case us of
[u] -> forgotUserPassword u
_ -> pure ()
......
......@@ -26,18 +26,20 @@ import Control.Monad.Trans.Control (MonadBaseControl)
import Data.Maybe (fromJust)
import Data.Pool qualified as Pool
import Database.PostgreSQL.Simple qualified as PSQL
import GHC.IO.Exception (IOException(..), IOErrorType(OtherError))
import Gargantext.API.Admin.Orchestrator.Types (JobLog, noJobLog)
import Gargantext.API.Errors (BackendInternalError)
import Gargantext.API.Job (RemainingSteps(..), jobLogStart, jobLogProgress, jobLogFailures, jobLogComplete, addErrorEvent, jobLogFailTotal, jobLogFailTotalWithMessage, jobLogAddMore, addWarningEvent)
import Gargantext.API.Prelude (GargM)
import Gargantext.Core.Notifications.CentralExchange qualified as CE
import Gargantext.Core.Notifications.CentralExchange.Types qualified as CET
import Gargantext.Core.Config (GargConfig(..), HasConfig(..), gc_logging, LogConfig)
import Gargantext.Core.Config.Mail qualified as Mail
import Gargantext.Core.Config.Utils (readConfig)
import Gargantext.Core.Config.Types (SettingsFile(..))
import Gargantext.Core.Config.Utils (readConfig)
import Gargantext.Core.Mail.Types (HasMail(..))
import Gargantext.Core.NLP (HasNLPServer(..), NLPServerMap, nlpServerMap)
import Gargantext.Core.NodeStory (HasNodeStoryEnv(..), HasNodeStoryImmediateSaver(..), HasNodeArchiveStoryImmediateSaver(..), NodeStoryEnv, fromDBNodeStoryEnv, nse_saver_immediate, nse_archive_saver_immediate)
import Gargantext.Core.NodeStory (HasNodeStoryEnv(..), NodeStoryEnv, mkNodeStoryEnv)
import Gargantext.Core.Notifications.CentralExchange qualified as CE
import Gargantext.Core.Notifications.CentralExchange.Types qualified as CET
import Gargantext.Core.Types (HasValidationError(..))
import Gargantext.Core.Worker.Types (JobInfo(..))
import Gargantext.Database.Prelude (HasConnectionPool(..))
......@@ -45,11 +47,10 @@ import Gargantext.Database.Query.Table.Node.Error (HasNodeError(..))
import Gargantext.Database.Query.Tree.Error (HasTreeError(..))
import Gargantext.Prelude hiding (to)
import Gargantext.System.Logging (HasLogger(..), Logger, LogLevel(..), MonadLogger(..), withLogger, logMsg, withLoggerIO)
import Gargantext.System.Logging.Loggers
import Gargantext.Utils.Jobs.Monad ( MonadJobStatus(..), JobHandle )
import GHC.IO.Exception (IOException(..), IOErrorType(OtherError))
import Prelude qualified
import System.Log.FastLogger qualified as FL
import Gargantext.System.Logging.Loggers
data WorkerEnv = WorkerEnv
......@@ -57,7 +58,7 @@ data WorkerEnv = WorkerEnv
, _w_env_logger :: ~(Logger (GargM WorkerEnv IOException))
-- the pool is a pool for gargantext db, not pgmq db!
, _w_env_pool :: ~(Pool.Pool PSQL.Connection)
, _w_env_nodeStory :: ~NodeStoryEnv
, _w_env_nodeStory :: ~(NodeStoryEnv BackendInternalError)
, _w_env_mail :: ~Mail.MailConfig
, _w_env_nlp :: ~NLPServerMap
, _w_env_job_state :: ~(TVar (Maybe WorkerJobState))
......@@ -82,12 +83,12 @@ withWorkerEnv settingsFile k = do
-- pool <- newPool $ _gc_database_config cfg
let dbConfig = _gc_database_config cfg
pool <- Pool.newPool $ Pool.setNumStripes (Just 1) $ Pool.defaultPoolConfig (PSQL.connect dbConfig) PSQL.close 60 4
nodeStory_env <- fromDBNodeStoryEnv pool
_w_env_job_state <- newTVarIO Nothing
pure $ WorkerEnv
{ _w_env_pool = pool
{ -- NOTE(adn) I think that with DBTx we no longer need a pool in the env. Remove?
_w_env_pool = pool
, _w_env_logger = logger
, _w_env_nodeStory = nodeStory_env
, _w_env_nodeStory = mkNodeStoryEnv
, _w_env_config = cfg
, _w_env_mail = _gc_mail_config cfg
, _w_env_nlp = nlpServerMap $ _gc_nlp_config cfg
......@@ -116,15 +117,9 @@ instance HasMail WorkerEnv where
instance HasNLPServer WorkerEnv where
nlpServer = to _w_env_nlp
instance HasNodeStoryEnv WorkerEnv where
instance HasNodeStoryEnv WorkerEnv BackendInternalError where
hasNodeStory = to _w_env_nodeStory
instance HasNodeStoryImmediateSaver WorkerEnv where
hasNodeStoryImmediateSaver = hasNodeStory . nse_saver_immediate
instance HasNodeArchiveStoryImmediateSaver WorkerEnv where
hasNodeArchiveStoryImmediateSaver = hasNodeStory . nse_archive_saver_immediate
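Since `mkNodeStoryEnv` is used above as a plain value for the `_w_env_nodeStory` field, wiring a node-story environment no longer needs IO or a connection pool. A minimal sketch (hypothetical binding, assuming `mkNodeStoryEnv` is polymorphic in the error type as its use above suggests):

-- The node-story environment can be built purely; no pool, no IO.
workerNodeStoryEnv :: NodeStoryEnv BackendInternalError
workerNodeStoryEnv = mkNodeStoryEnv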
instance MonadLogger (GargM WorkerEnv IOException) where
getLogger = asks _w_env_logger
......@@ -181,7 +176,7 @@ instance HasLogger WorkerMonad where
newtype instance Logger WorkerMonad =
WorkerMonadLogger { _WorkerMonadLogger :: MonadicStdLogger FL.LogStr IO }
type instance LogInitParams WorkerMonad = LogConfig
type instance LogPayload WorkerMonad = FL.LogStr
type instance LogPayload WorkerMonad = FL.LogStr
initLogger cfg = fmap WorkerMonadLogger $ (liftIO $ monadicStdLogger cfg)
destroyLogger = liftIO . _msl_destroy . _WorkerMonadLogger
logMsg (WorkerMonadLogger ioLogger) lvl msg = liftIO $ _msl_log_msg ioLogger lvl msg
......
......@@ -14,23 +14,25 @@ TODO: NodeError
-}
{-# LANGUAGE ScopedTypeVariables #-}
module Gargantext.Database.Action.Delete
where
import Control.Lens (view)
import Data.Text (unpack)
import Gargantext.Core (HasDBid(..))
import Gargantext.Core.Notifications.CentralExchange.Types (ce_notify, CEMessage(..))
import Gargantext.Core.Notifications.CentralExchange.Types (ce_notify, CEMessage(..), HasCentralExchangeNotification)
import Gargantext.Core.Types.Individu (User(..))
import Gargantext.Database.Action.Share (delFolderTeam)
import Gargantext.Database.Action.User (getUserId)
import Gargantext.Database.Admin.Types.Hyperdata.File ( HyperdataFile(..) )
import Gargantext.Database.Admin.Types.Node ( NodeId, NodeType(..) ) -- (NodeType(..))
import Gargantext.Database.GargDB qualified as GargDB
import Gargantext.Database.Prelude (Cmd, IsDBEnvExtra)
import Gargantext.Database.Prelude
import Gargantext.Database.Query.Table.Node (getNodeWith)
import Gargantext.Database.Query.Table.Node qualified as N (getNode, deleteNode)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError, errorWith)
import Gargantext.Database.Schema.Node
import Gargantext.Prelude
......@@ -38,40 +40,39 @@ import Gargantext.Prelude
-- TODO
-- Delete Corpus children according to their types
-- Delete NodeList (NodeStory + cbor file)
deleteNode :: (IsDBEnvExtra env, HasNodeError err)
deleteNode :: (HasNodeError err
, IsDBCmd env err m
, HasCentralExchangeNotification env
)
=> User
-> NodeId
-> Cmd env err Int
-> m Int
deleteNode u nodeId = do
node' <- N.getNode nodeId
num <- case (view node_typename node') of
nt | nt == toDBid NodeUser -> panicTrace "[G.D.A.D.deleteNode] Not allowed to delete NodeUser (yet)"
nt | nt == toDBid NodeTeam -> do
uId <- getUserId u
if _node_user_id node' == uId
then N.deleteNode nodeId
else delFolderTeam u nodeId
nt | nt == toDBid NodeFile -> do
node <- getNodeWith nodeId (Proxy :: Proxy HyperdataFile)
let (HyperdataFile { _hff_path = path }) = node ^. node_hyperdata
GargDB.rmFile $ unpack path
N.deleteNode nodeId
_ -> N.deleteNode nodeId
(num, upd_node, cleanup) <- runDBTx $ do
node' <- N.getNode nodeId
(rows, clean_it) <- case view node_typename node' of
nt | nt == toDBid NodeUser -> errorWith "[G.D.A.D.deleteNode] Not allowed to delete NodeUser (yet)"
nt | nt == toDBid NodeTeam -> do
uId <- getUserId u
if _node_user_id node' == uId
then do
r <- N.deleteNode nodeId
pure (r, pure ())
else do
r <- delFolderTeam u nodeId
pure (r, pure ())
nt | nt == toDBid NodeFile -> do
node <- getNodeWith nodeId (Proxy :: Proxy HyperdataFile)
let (HyperdataFile { _hff_path = path }) = node ^. node_hyperdata
r <- N.deleteNode nodeId
pure (r, GargDB.rmFile $ unpack path)
_ -> do
r <- N.deleteNode nodeId
pure (r, pure ())
pure (rows, node', clean_it)
-- | Node was deleted, refresh its parent (if exists)
-- mapM_ (CE.ce_notify . CE.UpdateTreeFirstLevel) nodeIds
case view node_parent_id node' of
Nothing -> return ()
Just pId -> ce_notify $ UpdateTreeFirstLevel pId
return num
-- if hasNodeType node' NodeUser
-- then panic "Not allowed to delete NodeUser (yet)"
-- else if hasNodeType node' NodeTeam
-- then do
-- uId <- getUserId u
-- if _node_user_id node' == uId
-- then N.deleteNode nodeId
-- else delFolderTeam u nodeId
-- else N.deleteNode nodeId
cleanup
for_ (view node_parent_id upd_node) $ ce_notify . UpdateTreeFirstLevel
pure num
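The same "collect now, run after" shape can be read in isolation from the following sketch (hypothetical `deleteThenNotify` name, indicative constraints): the transaction returns its result together with the data needed for the deferred effects, and the caller fires those effects only once the transaction has committed.

deleteThenNotify :: ( HasNodeError err
                    , IsDBCmd env err m
                    , HasCentralExchangeNotification env )
                 => NodeId -> m Int
deleteThenNotify nodeId = do
  (num, mParent) <- runDBTx $ do
    node' <- N.getNode nodeId
    rows  <- N.deleteNode nodeId
    pure (rows, view node_parent_id node')
  -- Notifications are fired only after the transaction has succeeded.
  for_ mParent $ ce_notify . UpdateTreeFirstLevel
  pure num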
......@@ -11,11 +11,12 @@ Portability : POSIX
{-# LANGUAGE ConstraintKinds #-}
{-# LANGUAGE ConstrainedClassMethods #-}
{-# LANGUAGE ScopedTypeVariables #-}
module Gargantext.Database.Action.Flow.List
where
import Control.Lens ((+~), (%~), at)
import Control.Lens ((+~), (%~), at, view)
import Data.List qualified as List
import Data.Map.Strict (toList)
import Data.Map.Strict qualified as Map
......@@ -23,13 +24,14 @@ import Data.Map.Strict.Patch qualified as PM
import Gargantext.API.Ngrams (saveNodeStory)
import Gargantext.API.Ngrams.Tools (getNodeStory)
import Gargantext.API.Ngrams.Types
import Gargantext.Core.NodeStory (HasNodeStory, a_history, a_state, a_version)
import Gargantext.Core.NodeStory (HasNodeStory, a_history, a_state, a_version, HasNodeStoryEnv (..))
import Gargantext.Core.Text.Ngrams (NgramsType(..))
import Gargantext.Core.Types (HasValidationError(..), assertValid)
import Gargantext.Core.Types.Main (ListType(CandidateTerm))
import Gargantext.Database.Admin.Types.Node (ListId, NodeId)
import Gargantext.Database.Query.Table.NodeNgrams (NodeNgramsPoly(..), NodeNgramsW, listInsertDb,{- getCgramsId -})
import Gargantext.Prelude hiding (toList)
import Gargantext.Database.Prelude
-- FLOW LIST
-- 1. select specific terms of the corpus when compared with others langs
......@@ -80,7 +82,7 @@ flowList_DbRepo :: (HasValidationError err, HasNodeStory env err m)
-> m ListId
flowList_DbRepo lId ngs = do
-- printDebug "listId flowList" lId
_mapCgramsId <- listInsertDb lId toNodeNgramsW (Map.toList ngs)
_mapCgramsId <- runDBTx $ listInsertDb lId toNodeNgramsW (Map.toList ngs)
{-
let toInsert = catMaybes [ (,) <$> (getCgramsId mapCgramsId ntype <$> (unNgramsTerm <$> parent))
<*> getCgramsId mapCgramsId ntype ngram
......@@ -163,7 +165,7 @@ listInsert lId ngs = mapM_ (\(typeList, ngElmts)
-- This function is maintained for its usage in Database.Action.Flow.List.
-- If the given list of ngrams elements contains ngrams already in
-- the repo, they will be ignored.
putListNgrams :: (HasValidationError err, HasNodeStory env err m)
putListNgrams :: forall env err m. (HasValidationError err, HasNodeStory env err m)
=> NodeId
-> NgramsType
-> [NgramsElement]
......@@ -186,24 +188,27 @@ putListNgrams nodeId ngramsType nes = putListNgrams' nodeId ngramsType m
let p1 = NgramsTablePatch . PM.fromMap $ NgramsReplace Nothing . Just <$> ns
(p, p_validity) = PM.singleton ngramsType' p1
assertValid p_validity
{-
-- TODO
v <- currentVersion
q <- commitStatePatch (Versioned v p)
assert empty q
-- What if another commit comes in between?
-- Shall we have a blindCommitStatePatch? It would not ask for a version but just a patch.
-- The modifyMVar_ would test the patch with applicable first.
-- If valid the rest would be atomic and no merge is required.
-}
a <- getNodeStory listId
let a' = a & a_version +~ 1
& a_history %~ (p :)
& a_state . at ngramsType' .~ Just ns
-- liftBase $ atomically $ do
-- r <- readTVar var
-- writeTVar var $
-- r & unNodeStory . at listId . _Just . a_version +~ 1
-- & unNodeStory . at listId . _Just . a_history %~ (p :)
-- & unNodeStory . at listId . _Just . a_state . at ngramsType' .~ Just ns
saveNodeStory listId a'
env <- view hasNodeStory
runDBTx $ do
{-
-- TODO
v <- currentVersion
q <- commitStatePatch (Versioned v p)
assert empty q
-- What if another commit comes in between?
-- Shall we have a blindCommitStatePatch? It would not ask for a version but just a patch.
-- The modifyMVar_ would test the patch with applicable first.
-- If valid the rest would be atomic and no merge is required.
-}
a <- getNodeStory env listId
let a' = a & a_version +~ 1
& a_history %~ (p :)
& a_state . at ngramsType' .~ Just ns
-- liftBase $ atomically $ do
-- r <- readTVar var
-- writeTVar var $
-- r & unNodeStory . at listId . _Just . a_version +~ 1
-- & unNodeStory . at listId . _Just . a_history %~ (p :)
-- & unNodeStory . at listId . _Just . a_state . at ngramsType' .~ Just ns
saveNodeStory env listId a'
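A small, purely illustrative helper (hypothetical name, not in this MR; the type signature is omitted because it simply mirrors the constraints of `putListNgrams'` above) showing the read-modify-write shape in isolation:

-- Apply a pure modification to a list's story inside one transaction.
overNodeStory env listId f = runDBTx $ do
  a <- getNodeStory env listId
  saveNodeStory env listId (f a)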
......@@ -37,6 +37,7 @@ import Gargantext.Database.Query.Tree.Error (HasTreeError)
import Gargantext.Database.Types (Indexed)
import Gargantext.Prelude
import Gargantext.System.Logging ( MonadLogger )
import Gargantext.Core.Notifications.CentralExchange.Types (HasCentralExchangeNotification)
type FlowCmdM env err m =
......@@ -46,6 +47,7 @@ type FlowCmdM env err m =
, HasValidationError err
, HasTreeError err
, MonadLogger m
, HasCentralExchangeNotification env
)
type FlowCorpus a = ( UniqParameters a
......