haskell-gargantext (Grégoire Locqueville) · Commits

Commit ca17a524 · authored Feb 15, 2023 by Karen Konou
    Merge branch 'dev' into 497-dev-node-write-selection

Parents: 8577b95d, 1dbff541
Showing 35 changed files with 1017 additions and 705 deletions.
Changed files:

    CHANGELOG.md                                          +22    -0
    README.md                                             +12    -5
    bin/track_haskell_deps.hs                             +265   -0
    cabal.project                                         +0     -189
    gargantext.cabal                                      +44    -44
    nix/pkgs.nix                                          +3     -1
    package.yaml                                          +44    -43
    src-test/Graph/Clustering.hs                          +2     -1
    src/Gargantext/API/Node/DocumentsFromWriteNodes.hs    +1     -1
    src/Gargantext/Core/Text/Corpus/Parsers.hs            +29    -17
    src/Gargantext/Core/Text/Corpus/Parsers/RIS.hs        +2     -4
    src/Gargantext/Core/Text/Corpus/Parsers/WOS.hs        +2     -1
    src/Gargantext/Core/Text/List/Management.sh           +96    -0
    src/Gargantext/Database/Action/Flow.hs                +5     -4
    src/Gargantext/Database/Action/Flow/Pairing.hs        +8     -11
    src/Gargantext/Database/Action/Search.hs              +53    -33
    src/Gargantext/Database/Action/Share.hs               +37    -21
    src/Gargantext/Database/Prelude.hs                    +14    -6
    src/Gargantext/Database/Query/Facet.hs                +70    -42
    src/Gargantext/Database/Query/Facet/Types.hs          +25    -25
    src/Gargantext/Database/Query/Join.hs                 +11    -11
    src/Gargantext/Database/Query/Table/Node/Select.hs    +17    -13
    src/Gargantext/Database/Query/Table/NodeContext.hs    +34    -27
    src/Gargantext/Database/Query/Table/NodeNode.hs       +37    -37
    src/Gargantext/Database/Query/Table/User.hs           +1     -1
    src/Gargantext/Database/Schema/Context.hs             +50    -50
    src/Gargantext/Database/Schema/ContextNodeNgrams.hs   +21    -18
    src/Gargantext/Database/Schema/Ngrams.hs              +12    -12
    src/Gargantext/Database/Schema/Node.hs                +50    -50
    src/Gargantext/Database/Schema/NodeContext.hs         +15    -15
    src/Gargantext/Database/Schema/NodeNode.hs            +13    -14
    src/Gargantext/Database/Schema/User.hs                +5     -5
    src/Gargantext/Utils/Tuple.hs                         +10    -0
    stack.yaml                                            +4     -4
    weeder.dhall                                          +3     -0
CHANGELOG.md

## Version 0.0.6.9.4.4

* [BACK][Add optional Accelerate dependency on Darwin for pkgs.nix](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/merge_requests/128)
* [BACK][Crawlers] Pubmed crawler: removed unused dependencies
* [BACK][CLEAN][Explore cutting down forks/extra-deps of libraries (#180)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/180)

## Version 0.0.6.9.4.3

* [BACK][DOC][Welcome: Door To enter the project (#177)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/177)
* [FRONT][DOC][Door to entry: Welcome! (#269)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/269)
* [BACK][REFACT][Doc Table: count (#175)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/175)

## Version 0.0.6.9.4.2

* [FRONT][FIX][Regression: on graph (#496)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/496)

## Version 0.0.6.9.4.1

* [FRONT][FIX][Invite many users without closing the modal (#498)](https://gitlab.iscpif.fr/gargantext/purescript-gargantext/issues/498)

## Version 0.0.6.9.4

* [BACK][FIX] Username and email are now always lowercased. Please use the migration script to avoid errors.
* [BACK][FIX][Ngrams Change insert causes Database error (#173)](https://gitlab.iscpif.fr/gargantext/haskell-gargantext/issues/173)
* [FRONT][CLEAN] Removing Isidore DB for now
* [BACK][FIX] WOS Parser

## Version 0.0.6.9.3
...
README.md

<div align="center"><img height="180" src="https://gitlab.iscpif.fr/gargantext/purescript-gargantext/raw/dev/dist/images/logo.png"></div>

# Gargantext with Haskell (Backend instance)

## About the project

GarganText is a collaborative web-decentralized-based macro-service
...

@@ -24,7 +29,7 @@ progress. Please report and improve this documentation if you encounter issues.

### Stack setup

-You need to install stack first:
+You need to install [Stack (or Haskell Tool Stack)](https://docs.haskellstack.org/en/stable/) first:

```shell
curl -sSL https://get.haskellstack.org/ | sh
```
...

@@ -33,20 +38,21 @@ curl -sSL https://get.haskellstack.org/ | sh

Verify the installation is complete with

```shell
stack --version
Version 2.9.1
```

### With Nix setup

-First install [nix](https://nixos.org/guides/install-nix.html):
+First install [Nix](https://nixos.org/download.html):

```shell
-curl -sSL https://nixos.org/nix/install | sh
+$ sh <(curl -L https://nixos.org/nix/install) --daemon
```

Verify the installation is complete

```shell
-$ nix-env
-nix-env (Nix) 2.3.12
+$ nix-env --version
+nix-env (Nix) 2.12.0
```

And just build:

```sh
...

@@ -114,6 +120,7 @@ then run:

```sh
stack --docker exec gargantext-init -- gargantext.ini
```

### Initialization
...
bin/track_haskell_deps.hs (new file, mode 100755)

```haskell
#!/usr/bin/env stack
-- stack runghc --package shelly --package algebraic-graphs --package async

{-
It's warmly recommended to compile this script as a binary, in order to exploit multicore
parallelism, e.g.:

stack exec ghc -- --make -O2 -threaded scripts/haskell/dependencies.hs
./dependencies +RTS -N
-}

{-# LANGUAGE BangPatterns        #-}
{-# LANGUAGE OverloadedStrings   #-}
{-# LANGUAGE RankNTypes          #-}
{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TupleSections       #-}
{-# LANGUAGE NumericUnderscores  #-}
{-# LANGUAGE ViewPatterns        #-}

module Main where

import Algebra.Graph
import Algebra.Graph.Export.Dot (Attribute(..), Style(..), export)
import Control.Concurrent.Async (mapConcurrently)
import Control.Concurrent
import Control.Exception
import Control.Monad
import Data.Functor.Identity
import Data.List
import qualified Data.Map.Strict as M
import Data.Monoid
import qualified Data.Set as Set
import qualified Data.Text as T
import Shelly
import System.IO
import Text.Printf
import Text.Read

--------------------------------------------------------------------------------
type PackageName = T.Text
data Version = V [Int] deriving (Eq, Ord)
type Package = (PackageName, Version)
type DAG = Graph Package
type DepMap = M.Map Package [Package]
type RevDepMap = M.Map Package [Package]

--------------------------------------------------------------------------------
readVersionM :: Monad m => (String -> m Int) -> T.Text -> m Version
readVersionM f = fmap V . sequence . map (f . T.unpack) . T.splitOn "."

--------------------------------------------------------------------------------
readVersionMaybe :: T.Text -> Maybe Version
readVersionMaybe = readVersionM readMaybe

--------------------------------------------------------------------------------
readVersion :: T.Text -> Version
readVersion = runIdentity . readVersionM (liftM read . pure)

logScreen :: MVar () -> String -> IO ()
logScreen screenLock msg = do
  () <- takeMVar screenLock
  putStrLn msg
  putMVar screenLock ()

--------------------------------------------------------------------------------
mkPackage :: T.Text -> Package
mkPackage t = case T.splitOn " " (T.strip t) of
  [name, ver] -> (name, readVersion ver)
  _ -> case T.breakOnEnd "-" (T.strip t) of
    ("", _)     -> error $ "mkPackage: " <> show t
    (name, ver) -> (T.init name, readVersion ver)

--------------------------------------------------------------------------------
blacklistedPackages :: [T.Text]
blacklistedPackages = []

--------------------------------------------------------------------------------
-- Filter blacklisted packages if they cannot be found by `ghc-pkg`, for some reason.
getTotalPackages :: IO [Package]
getTotalPackages = do
  rawList <- shelly $ silently $
    run "stack" ["--nix", "ls", "dependencies", "--test", "--bench"]
  return $ map mkPackage (filter (not . blacklisted) (T.lines rawList))
  where
    blacklisted x = or $ map (flip T.isInfixOf x) blacklistedPackages

--------------------------------------------------------------------------------
directDependenciesFor :: MVar () -> Package -> IO [Package]
directDependenciesFor screenLock (name, ver) = do
  res <- try $ shelly $ silently $
    run "stack" ["--nix", "exec", "ghc-pkg", "field", name, "depends"]
  case res of
    Left (err :: SomeException) -> do
      logScreen screenLock $ "Got: " <> show err
      logScreen screenLock "Skipping package..."
      pure mempty
    Right rawOutput ->
      case concatMap (T.words . T.replace "depends:" mempty . T.strip)
             (dropWhile (\l -> not ("depends:" `T.isInfixOf` l)) $ T.lines rawOutput) of
        deps -> do
          logScreen screenLock $ "Found " <> show (length deps) <> " deps for " <> show name
          let !normalised = concatMap (map (mkPackage . normalisePackage) . T.splitOn " ")
                                      (takeWhile (/= "depends:") deps)
          pure $! normalised

--------------------------------------------------------------------------------
buildPackageMap :: forall m. Monad m => (Package -> m [Package]) -> [Package] -> m DepMap
buildPackageMap _ []   = return M.empty
buildPackageMap f pkgs = go pkgs M.empty
  where
    go :: [Package] -> DepMap -> m DepMap
    go [] depMap = return depMap
    go (pkg:xs) depMap = do
      directDeps <- f pkg
      let !newMap = M.insert pkg directDeps $! depMap
      go xs newMap

--------------------------------------------------------------------------------
buildDependencyMap :: [Package] -> IO DepMap
buildDependencyMap allDeps = do
  screenLock <- newEmptyMVar
  putMVar screenLock ()
  mapAsList <- mapConcurrently (\pkg -> (pkg,) <$> directDependenciesFor screenLock pkg) allDeps
  return $ M.fromList mapAsList

--------------------------------------------------------------------------------
buildReverseDependencyMap :: [Package] -> DepMap -> RevDepMap
buildReverseDependencyMap allDeps depMap =
  runIdentity $ buildPackageMap (Identity . reverseDependenciesFor allDeps depMap) allDeps

--------------------------------------------------------------------------------
buildUniqueDependencyMap :: [Package] -> DepMap -> RevDepMap -> DepMap
buildUniqueDependencyMap allDeps depMap revMap =
  runIdentity $ buildPackageMap (Identity . uniqueDependenciesFor depMap revMap) allDeps

--------------------------------------------------------------------------------
buildDependencyDAG :: [Package] -> DepMap -> IO DAG
buildDependencyDAG allPkgs depMap = go allPkgs Set.empty
  where
    go :: [Package] -> Set.Set (Package, Package) -> IO DAG
    go [] dagEdges = return . edges . Set.toList $ dagEdges
    go (pkg:xs) dagEdges = do
      let directDeps = M.findWithDefault mempty pkg depMap
      let !newDag = dagEdges <> Set.fromList (map (pkg,) directDeps)
      go xs newDag

--------------------------------------------------------------------------------
-- | >>> normalisePackage "conduit-1.2.10-GgLn1U1QYcf9wsQecuZ1A4"
-- "conduit-1.2.10"
-- >>> normalisePackage "conduit-1.2.10"
-- "conduit-1.2.10"
normalisePackage :: T.Text -> T.Text
normalisePackage "rts" = "rts-0.0.0.0"
normalisePackage txt = case T.breakOnEnd "-" txt of
  (x, xs) -> case readVersionMaybe xs of
    Just _  -> txt
    Nothing -> if x == ""
                 then error ("normalisePackage: " <> show txt)
                 else T.init x

--------------------------------------------------------------------------------
unavoidableDeps :: Package -> Package -> Bool
unavoidableDeps myself x = and
  [ x /= myself
  , not ("gargantext" `T.isInfixOf` (fst x))
  ]

--------------------------------------------------------------------------------
-- | Filter "unavoidable" dependencies like the ones of the gargantext family.
reverseDependenciesFor :: [Package] -> DepMap -> Package -> [Package]
reverseDependenciesFor allDeps directDeps pkg = go (filter (unavoidableDeps pkg) allDeps) mempty
  where
    go []     !revDeps = revDeps
    go (x:xs) !revDeps = case reachableFrom x of
      True  -> go xs (x : revDeps)
      False -> go xs revDeps

    -- For each package x, check the graph to see if there is a path going
    -- from x to `pkg`. If there is, we found a reverse dep.
    reachableFrom :: Package -> Bool
    reachableFrom directDep =
      let depsForThis = M.findWithDefault mempty directDep directDeps
      in case pkg `elem` depsForThis of
           True  -> True
           False -> go depsForThis
      where
        go :: [Package] -> Bool
        go [] = False
        go xs = any reachableFrom xs

--------------------------------------------------------------------------------
-- | Compute the "unique direct dependencies", which are the dependencies that
-- only this package introduces into the project.
-- In other terms, we need to count for each DIRECT dependency, the number of
-- REVERSE dependencies. If it's one, and it's the package in question, it
-- means that removing that dependency would also remove the associated package.
uniqueDependenciesFor :: DepMap -> RevDepMap -> Package -> [Package]
uniqueDependenciesFor directDeps revDeps pkg = go (M.findWithDefault mempty pkg directDeps) []
  where
    go []     !deps = deps
    go (d:ds) !deps = case M.findWithDefault mempty d revDeps of
      [x] | x == pkg -> go ds (d : deps)
      _              -> go ds deps

--------------------------------------------------------------------------------
style :: Style Package String
style = Style
  { graphName               = ""
  , preamble                = mempty
  , graphAttributes         = ["label" := "Example", "labelloc" := "top"]
  , defaultVertexAttributes = ["shape" := "circle"]
  , defaultEdgeAttributes   = mempty
  , vertexName              = \(name, _) -> T.unpack name
  , vertexAttributes        = \_ -> ["color" := "blue"]
  , edgeAttributes          = \_ _ -> ["style" := "dashed"]
  }

--------------------------------------------------------------------------------
dottify :: DAG -> IO ()
dottify dag = writeFile "dep_dot.graphviz" (export style dag)

--------------------------------------------------------------------------------
main :: IO ()
main = do
  hSetBuffering System.IO.stdout NoBuffering
  hSetBuffering System.IO.stderr NoBuffering
  allDeps <- getTotalPackages
  putStr "Building direct dependency map..."
  directDepMap <- buildDependencyMap allDeps
  putStrLn "ok."
  let revDepMap    = buildReverseDependencyMap allDeps directDepMap
  let uniqueDepMap = buildUniqueDependencyMap allDeps directDepMap revDepMap
  let tableHeader =  printf "%-40s" ("Package"      :: String)
                  <> printf "%-20s" ("Direct deps"  :: String)
                  <> printf "%-20s" ("Unique deps"  :: String)
                  <> printf "%-70s" ("Reverse deps" :: String)
  let tableEntry pkg (totalDeps, uniqueDeps) revDeps =
           printf "%-40s" (T.unpack pkg)
        <> printf "%-20s" (show totalDeps)
        <> printf "%-20s" (show uniqueDeps)
        <> printf "%-70s\n" (T.unpack $ showRevDeps revDeps)
  putStrLn tableHeader
  let depsMap       = M.map length directDepMap
  let sortedDepList = reverse (sortOn snd $ M.toList depsMap)
  let mkTableEntry (pkg@(pkgName, _), deps) =
        let revDeps    = M.findWithDefault mempty pkg revDepMap
            uniqueDeps = M.findWithDefault mempty pkg uniqueDepMap
        in tableEntry pkgName (deps, length uniqueDeps) revDeps
  forM_ sortedDepList (putStr . mkTableEntry)
  -- Display the total deps
  putStrLn $ "Total project deps: " <> (show $ length allDeps + length blacklistedPackages)

showRevDeps :: [Package] -> T.Text
showRevDeps [] =
  T.pack $ printf "%-4d%s" (0 :: Int) ("(possibly gargantext depends on it)" :: String)
showRevDeps [(pkgName, _)] =
  T.pack $ printf "%-4d%s" (1 :: Int) ("(" <> T.unpack pkgName <> ")")
showRevDeps xs
  | length xs <= 5 =
      T.pack $ printf "%-4d%s" (length xs)
               (T.unpack $ "(" <> T.intercalate "," (map fst xs) <> ")")
  | otherwise =
      T.pack $ printf "%-4d%s" (length xs)
               (T.unpack $ "(" <> T.intercalate "," (map fst (take 5 xs)) <> ",...)")
```
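The heart of the script is `reverseDependenciesFor`: for every candidate package it checks whether the target is reachable through the direct-dependency map. A minimal, self-contained sketch of that reachability idea, using an invented toy map rather than the project's real dependencies:

```haskell
-- Toy illustration of the reverse-dependency walk above; the package names
-- and dependency map are invented, not taken from the project.
import qualified Data.Map.Strict as M

type Pkg    = String
type DepMap = M.Map Pkg [Pkg]

toyDeps :: DepMap
toyDeps = M.fromList
  [ ("app",   ["aeson", "text"])
  , ("aeson", ["text"])
  , ("text",  [])
  ]

-- | True if q is reachable from p by following direct dependencies.
dependsOn :: DepMap -> Pkg -> Pkg -> Bool
dependsOn deps p q =
  let direct = M.findWithDefault [] p deps
  in q `elem` direct || any (\d -> dependsOn deps d q) direct

-- | Every package (other than q itself) from which q is reachable.
reverseDepsOf :: DepMap -> Pkg -> [Pkg]
reverseDepsOf deps q = [ p | p <- M.keys deps, p /= q, dependsOn deps p q ]

main :: IO ()
main = print (reverseDepsOf toyDeps "text")   -- ["aeson","app"]
```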
cabal.project (deleted, was mode 100644)

```
packages: .

allow-newer: base, accelerate, servant, time, classy-prelude
allow-newer: binary, primitive, vector

-- Patches
source-repository-package
  type: git
  location: https://github.com/alpmestan/servant-job.git
  tag: ceb251b91e8ec1804198422a3cdbdab08d843b79

source-repository-package
  type: git
  location: https://github.com/alpmestan/ekg-json.git
  tag: fd7e5d7325939103cd87d0dc592faf644160341c

source-repository-package
  type: git
  location: https://github.com/haskell-servant/servant.git
  tag: c2af6e775d1d36f2011d43aff230bb502f8fba63
  subdir: servant/
          servant-server/
          servant-client-core/
          servant-client/
          servant-auth/servant-auth/
          servant-auth/servant-auth-client/
          servant-auth/servant-auth-server/

source-repository-package
  type: git
  location: https://github.com/delanoe/patches-map.git
  tag: 76cae88f367976ff091e661ee69a5c3126b94694

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/patches-class.git
  tag: 271ba32d6c940029dc653354dd7974a819f48e77

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/cgenie/haskell-gargantext-prelude.git
  tag: 6bfdb29e9a576472c7fd7ebe648ad101e5b3927f

-- External Data API connectors
source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/crawlers/pubmed.git
  tag: 9cdba6423decad5acfacb0f274212fd8723ce734

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/crawlers/isidore.git
  tag: 3db385e767d2100d8abe900833c6e7de3ac55e1b

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/crawlers/istex.git
  tag: daeae80365250c4bd539f0a65e271f9aa37f731f

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/crawlers/hal.git
  tag: 020f5f9b308f5c23c925aedf5fb11f8b4728fb19

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/crawlers/arxiv-api.git
  tag: f3e517cc40d92e282c5245b23d253d2ca3f802e5

-- Graphs
source-repository-package
  type: git
  location: https://github.com/alpmestan/haskell-igraph.git
  tag: 9f55eb36639c8e0965c8bc539a57738869f33e9a

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/haskell-infomap.git
  tag: 6d1d60b952b9b2b272b58fc5539700fd8890ac88

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/gargantext-graph.git
  tag: f41ee8b53c3264e5aa5adc06b2e5b293d2a8c474

-- Data mining
source-repository-package
  type: git
  location: https://github.com/delanoe/data-time-segment.git
  tag: 10a416b9f6c443866b36479c3441ebb3bcdeb7ef

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/gargantext/hlcm.git
  tag: 6f0595d2421005837d59151a8b26eee83ebb67b5

source-repository-package
  type: git
  location: https://github.com/delanoe/hstatistics.git
  tag: 90eef7604bb230644c2246eccd094d7bfefcb135

source-repository-package
  type: git
  location: https://github.com/paulrzcz/HSvm.git
  tag: 3fe28b683aba5ddf05e3b5f8eced0bd05c5a29f9

-- servant
source-repository-package
  type: git
  location: https://github.com/delanoe/servant-static-th.git
  tag: 8cb8aaf2962ad44d319fcea48442e4397b3c49e8

-- source-repository-package
--   type: git
--   location: https://github.com/alpmestan/servant-job.git
--   tag: e9a4c57ca3ddee450627ed251df942effb27e4be

-- Database libraries
source-repository-package
  type: git
  location: https://github.com/delanoe/haskell-opaleye.git
  tag: 756cb90f4ce725463d957bc899d764e0ed73738c

source-repository-package
  type: git
  location: https://github.com/delanoe/hsparql.git
  tag: 308c74b71a1abb0a91546fa57d353131248e3a7f

source-repository-package
  type: git
  location: https://github.com/alpmestan/rdf4h.git
  tag: fc24987d3af348a677748f226e48d64779a694e9

-- numerical computing
source-repository-package
  type: git
  location: https://github.com/alpmestan/accelerate.git
  tag: 640b5af87cea94b61c7737d878e6f7f2fca5c015

source-repository-package
  type: git
  location: https://gitlab.iscpif.fr/amestanogullari/accelerate-utility.git
  tag: a3875fe652d3bb5acb522674c22c6c814c1b4ad0

source-repository-package
  type: git
  location: https://github.com/alpmestan/accelerate-arithmetic.git
  tag: a110807651036ca2228a76507ee35bbf7aedf87a

source-repository-package
  type: git
  location: https://github.com/alpmestan/accelerate-llvm.git
  tag: 944f5a4aea35ee6aedb81ea754bf46b131fce9e3
  subdir: accelerate-llvm/
          accelerate-llvm-native/

source-repository-package
  type: git
  location: https://github.com/alpmestan/hmatrix.git
  tag: b9fca8beee0f23c17a6b2001ec834d071709e6e7
  subdir: packages/base/

-- Wikidata
source-repository-package
  type: git
  location: https://github.com/rspeer/wikiparsec.git
  tag: 9637a82344bb70f7fa8f02e75db3c081ccd434ce

-- numerical computing
source-repository-package
  type: git
  location: https://github.com/alpmestan/sparse-linear.git
  tag: bc6ca8058077b0b5702ea4b88bd4189cfcad267a
  subdir: sparse-linear/

constraints: unordered-containers ==0.2.14.*,
             servant-ekg ==0.3.1,
             time ==1.9.3,
             stm ==2.5.0.1,
             vector ==0.12.3.0,
             eigen ==3.3.7.0,
             cborg ==0.2.6.0,
             primitive ==0.7.3.0

package accelerate
  flags: +debug
```
gargantext.cabal

```diff
@@ -5,7 +5,7 @@ cabal-version: 1.12

 -- see: https://github.com/sol/hpack

 name:           gargantext
-version:        0.0.6.9.4
+version:        0.0.6.9.4.4
 synopsis:       Search, map, share
 description:    Please see README.md
 category:       Data
...
@@ -30,61 +30,34 @@ library
   exposed-modules:
       Gargantext
      Gargantext.API
      Gargantext.API.Admin.Auth.Types
      Gargantext.API.Admin.EnvTypes
      Gargantext.API.Admin.Settings
      Gargantext.API.Admin.Types
      Gargantext.API.Dev
      Gargantext.API.HashedResponse
      Gargantext.API.Node
      Gargantext.API.Node.File
      Gargantext.API.Node.Share
      Gargantext.API.Ngrams
      Gargantext.API.Ngrams.Prelude
      Gargantext.API.Ngrams.Tools
      Gargantext.API.Ngrams.Types
      Gargantext.API.Prelude
      Gargantext.Core
      Gargantext.Core.NodeStory
      Gargantext.Core.Methods.Similarities
      Gargantext.Core.Types
      Gargantext.Core.Types.Individu
      Gargantext.Core.Types.Main
      Gargantext.Core.Utils
      Gargantext.Core.Utils.Prefix
      Gargantext.Utils.Jobs
      Gargantext.Utils.Jobs.API
      Gargantext.Utils.Jobs.Map
      Gargantext.Utils.Jobs.Monad
      Gargantext.Utils.Jobs.Queue
      Gargantext.Utils.Jobs.Settings
      Gargantext.Utils.Jobs.State
      Gargantext.Utils.SpacyNLP
      Gargantext.Database.Action.Flow
      Gargantext.Database.Action.Flow.Types
      Gargantext.Database.Action.User.New
      Gargantext.Database.Query.Table.User
      Gargantext.Database.Query.Table.Node
      Gargantext.Database.Query.Table.Node.UpdateOpaleye
      Gargantext.Database.Query.Table.NgramsPostag
      Gargantext.Database.Schema.Ngrams
      Gargantext.Database.Prelude
      Gargantext.Database.Admin.Trigger.Init
      Gargantext.Database.Admin.Config
      Gargantext.Database.Admin.Types.Hyperdata
      Gargantext.Database.Admin.Types.Node
      Gargantext.Defaults
      Gargantext.Core.Text
      Gargantext.Core.Text.Context
      Gargantext.Core.Text.Corpus.API
      Gargantext.Core.Text.Corpus.Parsers
      Gargantext.Core.Text.Corpus.Parsers.CSV
      Gargantext.Core.Text.Corpus.Parsers.Date.Parsec
      Gargantext.Core.Text.List.Formats.CSV
      Gargantext.Core.Text.Metrics
      Gargantext.Core.Text.Metrics.TFICF
      Gargantext.Core.Text.Metrics.CharByChar
      Gargantext.Core.Text.Metrics.Count
      Gargantext.Core.Text.Prepare
      Gargantext.Core.Text.Search
      Gargantext.Core.Text.Terms
...
@@ -94,18 +67,47 @@ library
      Gargantext.Core.Text.Terms.Multi.Lang.Fr
      Gargantext.Core.Text.Terms.Multi.RAKE
      Gargantext.Core.Text.Terms.WithList
      Gargantext.Core.Types
      Gargantext.Core.Types.Individu
      Gargantext.Core.Types.Main
      Gargantext.Core.Utils
      Gargantext.Core.Utils.Prefix
      Gargantext.Core.Viz.Graph
      Gargantext.Core.Viz.Graph.Index
      Gargantext.Core.Viz.Graph.Tools
      Gargantext.Core.Viz.Graph.Tools.IGraph
      Gargantext.Core.Viz.Graph.Types
      Gargantext.Core.Viz.Phylo
      Gargantext.Core.Viz.Phylo.API
      Gargantext.Core.Viz.Phylo.API.Tools
      Gargantext.Core.Viz.Phylo.PhyloExport
      Gargantext.Core.Viz.Phylo.PhyloMaker
      Gargantext.Core.Viz.Phylo.PhyloTools
      Gargantext.Core.Viz.Phylo.SynchronicClustering
      Gargantext.Core.Viz.Types
      Gargantext.Database.Action.Flow
      Gargantext.Database.Action.Flow.Types
      Gargantext.Database.Action.User.New
      Gargantext.Database.Admin.Config
      Gargantext.Database.Admin.Trigger.Init
      Gargantext.Database.Admin.Types.Hyperdata
      Gargantext.Database.Admin.Types.Node
      Gargantext.Database.Prelude
      Gargantext.Database.Query.Table.NgramsPostag
      Gargantext.Database.Query.Table.Node
      Gargantext.Database.Query.Table.Node.UpdateOpaleye
      Gargantext.Database.Query.Table.User
      Gargantext.Database.Schema.Ngrams
      Gargantext.Defaults
      Gargantext.Utils.Jobs
      Gargantext.Utils.Jobs.API
      Gargantext.Utils.Jobs.Map
      Gargantext.Utils.Jobs.Monad
      Gargantext.Utils.Jobs.Queue
      Gargantext.Utils.Jobs.Settings
      Gargantext.Utils.Jobs.State
      Gargantext.Utils.SpacyNLP
+     Gargantext.Utils.Tuple
   other-modules:
      Gargantext.API.Admin.Auth
      Gargantext.API.Admin.FrontEnd
...
@@ -233,7 +235,6 @@ library
      Gargantext.Core.Viz.Graph.Legend
      Gargantext.Core.Viz.Graph.PatriciaTreeTypes
      Gargantext.Core.Viz.Graph.Tools.Infomap
-     Gargantext.Core.Viz.Graph.Types
      Gargantext.Core.Viz.Graph.Utils
      Gargantext.Core.Viz.LegacyPhylo
      Gargantext.Core.Viz.Phylo.Example
...
@@ -430,7 +431,6 @@ library
     , matrix
     , monad-control
     , monad-logger
     , monad-logger-aeson
     , morpheus-graphql
     , morpheus-graphql-app
     , morpheus-graphql-core
...
```
nix/pkgs.nix

```diff
@@ -32,7 +32,9 @@ rec {
     icu
     graphviz
     llvm_9
-  ];
+  ] ++ (lib.optionals stdenv.isDarwin [
+         darwin.apple_sdk.frameworks.Accelerate
+       ]);
   libPaths = pkgs.lib.makeLibraryPath nonhsBuildInputs;
   shellHook = ''
     export LD_LIBRARY_PATH="${pkgs.gfortran7.cc.lib}:${libPaths}:$LD_LIBRARY_PATH"
...
```
package.yaml

```diff
@@ -6,7 +6,7 @@ name: gargantext
 # |  |  |  +----- Layers * : New versions with API additions
 # |  |  |  |  +--- Layers * : New versions without API breaking changes
 # |  |  |  |  |
-version:     '0.0.6.9.4'
+version:     '0.0.6.9.4.4'
 synopsis:    Search, map, share
 description: Please see README.md
 category:    Data
...
@@ -55,61 +55,34 @@ library:
   exposed-modules:
   - Gargantext
   - Gargantext.API
   - Gargantext.API.Admin.Auth.Types
   - Gargantext.API.Admin.EnvTypes
   - Gargantext.API.Admin.Settings
   - Gargantext.API.Admin.Types
   - Gargantext.API.Dev
   - Gargantext.API.HashedResponse
   - Gargantext.API.Node
   - Gargantext.API.Node.File
   - Gargantext.API.Node.Share
   - Gargantext.API.Ngrams
   - Gargantext.API.Ngrams.Prelude
   - Gargantext.API.Ngrams.Tools
   - Gargantext.API.Ngrams.Types
   - Gargantext.API.Prelude
   - Gargantext.Core
   - Gargantext.Core.NodeStory
   - Gargantext.Core.Methods.Similarities
   - Gargantext.Core.Types
   - Gargantext.Core.Types.Individu
   - Gargantext.Core.Types.Main
   - Gargantext.Core.Utils
   - Gargantext.Core.Utils.Prefix
   - Gargantext.Utils.Jobs
   - Gargantext.Utils.Jobs.API
   - Gargantext.Utils.Jobs.Map
   - Gargantext.Utils.Jobs.Monad
   - Gargantext.Utils.Jobs.Queue
   - Gargantext.Utils.Jobs.Settings
   - Gargantext.Utils.Jobs.State
   - Gargantext.Utils.SpacyNLP
   - Gargantext.Database.Action.Flow
   - Gargantext.Database.Action.Flow.Types
   - Gargantext.Database.Action.User.New
   - Gargantext.Database.Query.Table.User
   - Gargantext.Database.Query.Table.Node
   - Gargantext.Database.Query.Table.Node.UpdateOpaleye
   - Gargantext.Database.Query.Table.NgramsPostag
   - Gargantext.Database.Schema.Ngrams
   - Gargantext.Database.Prelude
   - Gargantext.Database.Admin.Trigger.Init
   - Gargantext.Database.Admin.Config
   - Gargantext.Database.Admin.Types.Hyperdata
   - Gargantext.Database.Admin.Types.Node
   - Gargantext.Defaults
   - Gargantext.Core.Text
   - Gargantext.Core.Text.Context
   - Gargantext.Core.Text.Corpus.API
   - Gargantext.Core.Text.Corpus.Parsers
   - Gargantext.Core.Text.Corpus.Parsers.CSV
   - Gargantext.Core.Text.Corpus.Parsers.Date.Parsec
   - Gargantext.Core.Text.List.Formats.CSV
   - Gargantext.Core.Text.Metrics
   - Gargantext.Core.Text.Metrics.TFICF
   - Gargantext.Core.Text.Metrics.CharByChar
   - Gargantext.Core.Text.Metrics.Count
   - Gargantext.Core.Text.Prepare
   - Gargantext.Core.Text.Search
   - Gargantext.Core.Text.Terms
...
@@ -119,18 +92,47 @@ library:
   - Gargantext.Core.Text.Terms.Multi.Lang.Fr
   - Gargantext.Core.Text.Terms.Multi.RAKE
   - Gargantext.Core.Text.Terms.WithList
   - Gargantext.Core.Types
   - Gargantext.Core.Types.Individu
   - Gargantext.Core.Types.Main
   - Gargantext.Core.Utils
   - Gargantext.Core.Utils.Prefix
   - Gargantext.Core.Viz.Graph
   - Gargantext.Core.Viz.Graph.Index
   - Gargantext.Core.Viz.Graph.Tools
   - Gargantext.Core.Viz.Graph.Tools.IGraph
   - Gargantext.Core.Viz.Graph.Types
   - Gargantext.Core.Viz.Phylo
   - Gargantext.Core.Viz.Phylo.API
   - Gargantext.Core.Viz.Phylo.API.Tools
   - Gargantext.Core.Viz.Phylo.PhyloExport
   - Gargantext.Core.Viz.Phylo.PhyloMaker
   - Gargantext.Core.Viz.Phylo.PhyloTools
   - Gargantext.Core.Viz.Phylo.SynchronicClustering
   - Gargantext.Core.Viz.Types
   - Gargantext.Database.Action.Flow
   - Gargantext.Database.Action.Flow.Types
   - Gargantext.Database.Action.User.New
   - Gargantext.Database.Admin.Config
   - Gargantext.Database.Admin.Trigger.Init
   - Gargantext.Database.Admin.Types.Hyperdata
   - Gargantext.Database.Admin.Types.Node
   - Gargantext.Database.Prelude
   - Gargantext.Database.Query.Table.NgramsPostag
   - Gargantext.Database.Query.Table.Node
   - Gargantext.Database.Query.Table.Node.UpdateOpaleye
   - Gargantext.Database.Query.Table.User
   - Gargantext.Database.Schema.Ngrams
   - Gargantext.Defaults
   - Gargantext.Utils.Jobs
   - Gargantext.Utils.Jobs.API
   - Gargantext.Utils.Jobs.Map
   - Gargantext.Utils.Jobs.Monad
   - Gargantext.Utils.Jobs.Queue
   - Gargantext.Utils.Jobs.Settings
   - Gargantext.Utils.Jobs.State
   - Gargantext.Utils.SpacyNLP
+  - Gargantext.Utils.Tuple
   dependencies:
   - HSvm
   - KMP
...
@@ -214,7 +216,6 @@ library:
   - matrix
   - monad-control
   - monad-logger
   - monad-logger-aeson
   - morpheus-graphql
   - morpheus-graphql-app
   - morpheus-graphql-core
...
```
src-test/Graph/Clustering.hs

```diff
@@ -13,7 +13,8 @@ Portability : POSIX

 module Graph.Clustering where

 import Gargantext.API.Ngrams.Types (NgramsTerm(..))
-import Gargantext.Core.Viz.Graph (Graph(..), Strength(..))
+import Gargantext.Core.Viz.Graph
+import Gargantext.Core.Viz.Graph.Types
 import Gargantext.Core.Viz.Graph.Tools (doSimilarityMap)
 import Gargantext.Core.Viz.Graph.Tools.IGraph (spinglass)
 import Gargantext.Prelude
...
```
src/Gargantext/API/Node/DocumentsFromWriteNodes.hs

```diff
@@ -109,7 +109,7 @@ documentsFromWriteNodes uId nId Params { selection, lang, paragraphs } logStatus
   let parsedE = (\(node, contents) ->
                    hyperdataDocumentFromFrameWrite lang paragraphs (node, contents))
                 <$> frameWritesWithContents
   let parsed = List.concat $ rights parsedE
   printDebug "DocumentsFromWriteNodes: uId" uId
   _ <- flowDataText (RootId (NodeId uId))
                     (DataNew (Just $ fromIntegral $ length parsed, yieldMany parsed))
                     (Multi lang)
...
```
src/Gargantext/Core/Text/Corpus/Parsers.hs

```diff
@@ -20,14 +20,15 @@ please follow the types.

 {-# LANGUAGE PackageImports #-}

-module Gargantext.Core.Text.Corpus.Parsers (FileFormat(..), FileType(..), clean, parseFile, cleanText, parseFormatC)
+module Gargantext.Core.Text.Corpus.Parsers (FileFormat(..), FileType(..), clean, parseFile, cleanText, parseFormatC, splitOn)
     where

 -- import Gargantext.Core.Text.Learn (detectLangDefault)
 import "zip" Codec.Archive.Zip (withArchive, getEntry, getEntries)
 import Conduit
 import Control.Concurrent.Async as CCA (mapConcurrently)
 import Control.Monad (join)
 import Control.Monad.Trans.Control (MonadBaseControl)
 import Data.Attoparsec.ByteString (parseOnly, Parser)
 import Data.Either (Either(..))
 import Data.Either.Extra (partitionEithers)
...
@@ -38,25 +39,24 @@ import Data.String()
 import Data.Text (Text, intercalate, pack, unpack)
 import Data.Text.Encoding (decodeUtf8)
 import Data.Tuple.Extra (both, first, second)
 import Gargantext.API.Node.Corpus.New.Types (FileFormat(..))
 import Gargantext.Core (Lang(..))
 import Gargantext.Core.Text.Corpus.Parsers.CSV (parseHal, parseCsv, parseCsvC)
 import Gargantext.Core.Text.Corpus.Parsers.RIS.Presse (presseEnrich)
 import Gargantext.Database.Admin.Types.Hyperdata (HyperdataDocument(..))
+import Gargantext.Database.Query.Table.Ngrams (NgramsType(..))
 import Gargantext.Prelude
 import System.FilePath (FilePath(), takeExtension)
 import System.IO.Temp (emptySystemTempFile)
 import qualified Data.ByteString as DB
 import qualified Data.ByteString.Char8 as DBC
 import qualified Data.ByteString.Lazy as DBL
 import qualified Data.Map as DM
 import qualified Data.Text as DT
 import qualified Gargantext.Core.Text.Corpus.Parsers.Date as Date
 import qualified Gargantext.Core.Text.Corpus.Parsers.RIS as RIS
 import qualified Gargantext.Core.Text.Corpus.Parsers.WOS as WOS
 import qualified Prelude

 ------------------------------------------------------------------------
 type ParseError = String
...
@@ -168,12 +168,15 @@ parseFormatC _ _ _ = undefined

 parseFile :: FileType -> FileFormat -> FilePath -> IO (Either Prelude.String [HyperdataDocument])
 parseFile CsvHal    Plain p = parseHal p
 parseFile CsvGargV3 Plain p = parseCsv p
 parseFile RisPresse Plain p = do
   docs <- join $ mapM (toDoc RIS) <$> snd <$> enrichWith RisPresse <$> readFileWith RIS p
   pure $ Right docs
 parseFile WOS Plain p = do
   docs <- join $ mapM (toDoc WOS) <$> snd <$> enrichWith WOS <$> readFileWith WOS p
   pure $ Right docs
 parseFile ff _ p = do
   docs <- join $ mapM (toDoc ff) <$> snd <$> enrichWith ff <$> readFileWith ff p
   pure $ Right docs
...
@@ -184,19 +187,19 @@ toDoc ff d = do
   -- let abstract = lookup "abstract" d
   let lang = EN -- maybe EN identity (join $ detectLangDefault <$> (fmap (DT.take 50) abstract))

-  let dateToParse = DT.replace "-" " " <$> lookup "PY" d <> Just " " <> lookup "publication_date" d
+  let dateToParse = DT.replace " " "" <$> lookup "PY" d -- <> Just " " <> lookup "publication_date" d
+  printDebug "[G.C.T.C.Parsers] dateToParse" dateToParse

   (utcTime, (pub_year, pub_month, pub_day)) <- Date.dateSplit lang dateToParse

-  pure HyperdataDocument { _hd_bdd = Just $ DT.pack $ show ff
+  let hd = HyperdataDocument { _hd_bdd = Just $ DT.pack $ show ff
                          , _hd_doi = lookup "doi" d
                          , _hd_url = lookup "URL" d
                          , _hd_uniqId = Nothing
                          , _hd_uniqIdBdd = Nothing
                          , _hd_page = Nothing
                          , _hd_title = lookup "title" d
-                         , _hd_authors = Nothing
-                         , _hd_institutes = lookup "authors" d
+                         , _hd_authors = lookup "authors" d
+                         , _hd_institutes = lookup "institutes" d
                          , _hd_source = lookup "source" d
                          , _hd_abstract = lookup "abstract" d
                          , _hd_publication_date = fmap (DT.pack . show) utcTime
...
@@ -207,6 +210,8 @@ toDoc ff d = do
                          , _hd_publication_minute = Nothing
                          , _hd_publication_second = Nothing
                          , _hd_language_iso2 = Just $ (DT.pack . show) lang }
+  printDebug "[G.C.T.C.Parsers] HyperdataDocument" hd
+  pure hd

 enrichWith :: FileType -> (a, [[[(DB.ByteString, DB.ByteString)]]]) -> (a, [[(Text, Text)]])
...
@@ -267,3 +272,10 @@ clean txt = DBC.map clean' txt
     clean' '\t' = ' '
     clean' ';'  = '.'
     clean' c    = c
+
+--
+splitOn :: NgramsType -> Maybe Text -> Text -> [Text]
+splitOn Authors (Just "WOS") = (DT.splitOn "; ")
+splitOn _       _            = (DT.splitOn ", ")
```
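The new `splitOn` export picks a separator based on the ngrams type and the document's source database: "; " for WOS author fields, ", " otherwise. A minimal sketch of that separator choice, with the `NgramsType` argument dropped and invented sample strings:

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch of the separator choice made by the new splitOn export; the sample
-- author strings are invented and the NgramsType argument is omitted here.
import Data.Text (Text)
import qualified Data.Text as DT

splitAuthors :: Maybe Text -> Text -> [Text]
splitAuthors (Just "WOS") = DT.splitOn "; "
splitAuthors _            = DT.splitOn ", "

main :: IO ()
main = do
  print (splitAuthors (Just "WOS") "Doe, J.; Smith, A.")  -- ["Doe, J.","Smith, A."]
  print (splitAuthors Nothing      "Doe J., Smith A.")    -- ["Doe J.","Smith A."]
```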
src/Gargantext/Core/Text/Corpus/Parsers/RIS.hs

```diff
@@ -23,7 +23,7 @@ import Data.List (lookup)
 import Control.Applicative
 import Data.Attoparsec.ByteString (Parser, try, takeTill, take, many1)
 import Data.Attoparsec.ByteString.Char8 (isEndOfLine)
-import Data.ByteString (ByteString, concat)
+import Data.ByteString (ByteString, intercalate)
 import Gargantext.Prelude hiding (takeWhile, take)
 import qualified Data.List as DL
...
@@ -55,7 +55,7 @@ fieldWith n = do
   let txts' = case DL.length txts > 0 of
         True  -> txts
         False -> []
-  pure (name, concat ([txt] <> txts'))
+  pure (name, intercalate ";" ([txt] <> txts'))

 lines :: Parser [ByteString]
...
@@ -70,5 +70,3 @@ onField :: ByteString -> (ByteString -> [(ByteString, ByteString)])
         -> [(ByteString, ByteString)]
         -> [(ByteString, ByteString)]
 onField k f m = m <> (maybe [] f (lookup k m))
```
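The `fieldWith` change swaps `Data.ByteString.concat` for `intercalate ";"`, so continuation lines of a multi-valued RIS field stay distinguishable instead of being run together. A small sketch of the difference, on invented field values:

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch of the effect of swapping Data.ByteString.concat for intercalate ";"
-- when a RIS field spans several continuation lines (values invented).
import Data.ByteString.Char8 (ByteString)
import qualified Data.ByteString.Char8 as C8

continuations :: [ByteString]
continuations = ["first keyword", "second keyword", "third keyword"]

main :: IO ()
main = do
  C8.putStrLn (C8.concat continuations)          -- "first keywordsecond keywordthird keyword"
  C8.putStrLn (C8.intercalate ";" continuations) -- "first keyword;second keyword;third keyword"
```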
src/Gargantext/Core/Text/Corpus/Parsers/WOS.hs

```diff
@@ -52,6 +52,7 @@ keys field
   | field == "TI" = "title"
   | field == "SO" = "source"
   | field == "DI" = "doi"
   | field == "PY" = "publication_date"
+  | field == "PD" = "publication_date"
   | field == "SP" = "institutes"
   | field == "AB" = "abstract"
   | otherwise     = field
```
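With both "PY" and "PD" now normalised to "publication_date", a WOS record that carries only the "PD" tag still ends up with a date field that `toDoc` can look up. A small sketch of that normalisation on an invented record; only tags visible in the diff are handled:

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch of the tag normalisation after this change: a record carrying "PD"
-- instead of "PY" still yields a "publication_date" field (record invented).
import Data.Text (Text)

normaliseTag :: Text -> Text
normaliseTag t
  | t == "PY" || t == "PD" = "publication_date"
  | otherwise              = t

main :: IO ()
main = print (lookup "publication_date"
                     [ (normaliseTag k, v)
                     | (k, v) <- [("TI", "Some title" :: Text), ("PD", "2021")] ])
-- Just "2021"
```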
src/Gargantext/Core/Text/List/Management.sh (new file, mode 100644)

```haskell
{-|
Module      : Gargantext.Core.Text.Ngrams.List.Management
Description : Tools to manage lists
Copyright   : (c) CNRS, 2017-Present
License     : AGPL + CECILL v3
Maintainer  : team@gargantext.org
Stability   : experimental
Portability : POSIX
-}

{-# LANGUAGE ScopedTypeVariables #-}
{-# LANGUAGE TemplateHaskell #-}

module Gargantext.Core.Text.List.Management
  where

{-
import Data.HashMap.Strict (HashMap)
import Data.Map (Map)
import Gargantext.API.Ngrams
import Gargantext.API.Ngrams.Types (NgramsElement, NgramsTerm(..))
import Gargantext.Database.Action.Flow.Types
import Gargantext.API.Ngrams.Tools (getListNgrams)
import Gargantext.Core.NodeStory
import Gargantext.Core.Text (size)
import Gargantext.Core.Text.List.Group
import Gargantext.Core.Text.List.Group.Prelude
import Gargantext.Core.Text.List.Group.WithStem
import Gargantext.Core.Text.List.Social
import Gargantext.Core.Text.List.Social.Prelude
import Gargantext.Core.Text.Metrics (scored', Scored(..), scored_speExc, scored_genInc, normalizeGlobal, normalizeLocal, scored_terms)
import Gargantext.Core.Types (ListType(..), CorpusId, ListId)
import Gargantext.Core.Types.Individu (User(..))
import Gargantext.Database.Action.Metrics.NgramsByContext (getContextsByNgramsUser, getContextsByNgramsOnlyUser)
import Gargantext.Database.Action.Metrics.TFICF (getTficf_withSample)
import Gargantext.Database.Admin.Types.Node (NodeId)
import Gargantext.Database.Prelude (CmdM)
import Gargantext.Database.Query.Table.Ngrams (text2ngrams)
import Gargantext.Database.Query.Table.NgramsPostag (selectLems)
import Gargantext.Database.Query.Table.Node (defaultList, getClosestParentIdByType)
import Gargantext.Database.Query.Table.Node.Error (HasNodeError())
import Gargantext.Database.Query.Tree.Error (HasTreeError)
import Gargantext.Database.Action.Metrics.NgramsByContext (getOccByNgramsOnlyFast')
import Gargantext.Database.Schema.Ngrams (NgramsType(..), Ngrams(..))
import Gargantext.Prelude
import qualified Data.HashMap.Strict as HashMap
import qualified Data.HashSet as HashSet
import qualified Data.List as List
import qualified Data.Map as Map
import qualified Data.Set as Set
import qualified Gargantext.Data.HashMap.Strict.Utils as HashMap


restrictListSize
  :: forall env err m.
     (HasNodeStory env err m, FlowCmdM env err m)
  => CorpusId
  -> ListId
  -> NgramsType
  -> ListType
  -> Int -- ^ number of ngram pairs to keep
  -> m ()
restrictListSize corpusId listId ngramsType listType size = do
  ngrams <- getListNgrams [listId] ngramsType
  -- corpus_id <- getClosestParentIdByType
  occurrences <- getOccByNgramsOnlyFast' corpusId
                                         listId
                                         ngramsType
                                         (HashMap.keys ngrams)

  ngrams' <- filterWith listType size occurrences ngrams

  _ <- setListNgrams listId ngramsType ngrams'
  return ()

  where filterWith :: ListType -> Int -> HashMap NgramsTerm Int
                   -> HashMap NgramsTerm NgramsRepoElement
                   -> m (Map NgramsTerm NgramsRepoElement)
        filterWith listType' size occs ngrams =
          HashMap.filter with ngrams
          where
            with nre = case (&&) <$> Just (nre^.nre_list == listType)
                                 <*> (HashMap.lookup (nre^.nre_root) occs &&
-}
```
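The whole module above is still commented out (and its last expression is unfinished), but the intent of `restrictListSize` is to trim a list down to its most relevant terms by occurrence count. A rough, self-contained sketch of that idea, with invented term names and counts:

```haskell
-- Rough sketch of what restrictListSize is after (the module above is still
-- commented out): keep only the n most frequent terms. Data is invented.
import Data.List (sortOn)
import Data.Ord (Down(..))
import qualified Data.Map.Strict as M

restrictToSize :: Int -> M.Map String Int -> M.Map String Int
restrictToSize n = M.fromList . take n . sortOn (Down . snd) . M.toList

main :: IO ()
main = print (restrictToSize 2 (M.fromList [("graph", 12), ("node", 7), ("list", 3)]))
-- fromList [("graph",12),("node",7)]
```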
src/Gargantext/Database/Action/Flow.hs

```diff
@@ -75,11 +75,11 @@ import qualified Data.Conduit as C
 import Gargantext.API.Admin.Orchestrator.Types (JobLog(..))
 import Gargantext.Core (Lang(..), PosTagAlgo(..))
-import Gargantext.Core.Ext.IMT (toSchoolName)
+-- import Gargantext.Core.Ext.IMT (toSchoolName)
 import Gargantext.Core.Ext.IMTUser (readFile_Annuaire)
 import Gargantext.Core.Flow.Types
 import Gargantext.Core.Text
-import Gargantext.Core.Text.Corpus.Parsers (parseFile, FileFormat, FileType)
+import Gargantext.Core.Text.Corpus.Parsers (parseFile, FileFormat, FileType, splitOn)
 import Gargantext.Core.Text.List (buildNgramsLists)
 import Gargantext.Core.Text.List.Group.WithStem ({-StopSize(..),-} GroupParams(..))
 import Gargantext.Core.Text.List.Social (FlowSocialListWith(..))
...
@@ -550,13 +550,14 @@ instance ExtractNgramsT HyperdataDocument
                        $ _hd_source doc
         institutes = map text2ngrams
-                   $ maybe ["Nothing"] (map toSchoolName . (T.splitOn ", "))
+                   $ maybe ["Nothing"] (splitOn Institutes (doc ^. hd_bdd))
                    $ _hd_institutes doc
         authors    = map text2ngrams
-                   $ maybe ["Nothing"] (T.splitOn ", ")
+                   $ maybe ["Nothing"] (splitOn Authors (doc ^. hd_bdd))
                    $ _hd_authors doc
       termsWithCounts' <- map (\(t, cnt) -> (enrichedTerms (lang' ^. tt_lang) CoreNLP NP t, cnt))
                        <$> concat
                        <$> liftBase (extractTerms lang' $ hasText doc)
...
```
src/Gargantext/Database/Action/Flow/Pairing.hs

```diff
@@ -17,7 +17,7 @@ module Gargantext.Database.Action.Flow.Pairing
   where

 import Debug.Trace (trace)
-import Control.Lens (_Just, (^.))
+import Control.Lens (_Just, (^.), view)
 import Data.Hashable (Hashable)
 import Data.HashMap.Strict (HashMap)
 import Data.Maybe (fromMaybe, catMaybes)
...
@@ -35,7 +35,7 @@ import Gargantext.Database.Action.Metrics.NgramsByContext (getContextsByNgramsOn
 import Gargantext.Database.Admin.Config
 import Gargantext.Database.Admin.Types.Hyperdata -- (HyperdataContact(..))
 import Gargantext.Database.Admin.Types.Node -- (AnnuaireId, CorpusId, ListId, DocId, ContactId, NodeId)
-import Gargantext.Database.Query.Prelude (leftJoin2, returnA, queryNodeNodeTable)
+import Gargantext.Database.Query.Prelude (returnA, queryNodeNodeTable)
 import Gargantext.Database.Query.Table.Node (defaultList)
 import Gargantext.Database.Query.Table.Node.Children (getAllContacts)
 import Gargantext.Database.Query.Table.Node.Select (selectNodesWithUsername)
...
@@ -60,16 +60,13 @@ isPairedWith nId nt = runOpaQuery (selectQuery nt nId)
   where
     selectQuery :: NodeType -> NodeId -> Select (Column SqlInt4)
     selectQuery nt' nId' = proc () -> do
-      (node, node_node) <- queryJoin -< ()
-      restrict -< (node^.node_typename) .== (sqlInt4 $ toDBid nt')
-      restrict -< (node_node^.nn_node1_id) .== (toNullable $ pgNodeId nId')
+      node <- queryNodeTable -< ()
+      node_node <- optionalRestrict queryNodeNodeTable -<
+        \node_node' -> (node^.node_id) .== (node_node'^.nn_node2_id)
+      restrict -< (node^.node_typename) .== sqlInt4 (toDBid nt')
+      restrict -< (view nn_node1_id <$> node_node) .=== justFields (pgNodeId nId')
       returnA -< node^.node_id

-    queryJoin :: Select (NodeRead, NodeNodeReadNull)
-    queryJoin = leftJoin2 queryNodeTable queryNodeNodeTable cond
-      where
-        cond (node, node_node) = node^.node_id .== node_node^.nn_node2_id
-
 -----------------------------------------------------------------------
 pairing :: AnnuaireId -> CorpusId -> Maybe ListId -> GargNoServer [Int]
 pairing a c l' = do
...
@@ -85,7 +82,7 @@ dataPairing :: AnnuaireId
             -> (CorpusId, ListId, NgramsType)
             -> GargNoServer (HashMap ContactId (Set DocId))
 dataPairing aId (cId, lId, ngt) = do
   -- mc :: HM.HashMap ContactName (Set ContactId)
   mc <- getNgramsContactId aId
   -- md :: HM.HashMap DocAuthor (Set DocId)
   md <- getNgramsDocId cId lId ngt
...
```
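The rewrite of `selectQuery` replaces an explicit `leftJoin2` (yielding a nullable row) with `optionalRestrict`, which pairs each node with a `MaybeFields` of a matching `node_node` row. A plain-Haskell sketch of that shape change, using lists and `Maybe` with invented ids rather than real Opaleye queries:

```haskell
-- Plain-Haskell sketch of the shape change in selectQuery: instead of a
-- nullable joined row, each node is paired with Maybe a matching node_node
-- row, as optionalRestrict does on the Opaleye side. Ids are invented.
import Data.Maybe (listToMaybe)

type NodeId = Int

nodes :: [NodeId]
nodes = [1, 2, 3]

nodeNodes :: [(NodeId, NodeId)]   -- (node1_id, node2_id)
nodeNodes = [(10, 1), (20, 3)]

pairedWithLink :: [(NodeId, Maybe (NodeId, NodeId))]
pairedWithLink =
  [ (n, listToMaybe [ nn | nn@(_, n2) <- nodeNodes, n2 == n ]) | n <- nodes ]

main :: IO ()
main = print pairedWithLink
-- [(1,Just (10,1)),(2,Nothing),(3,Just (20,3))]
```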
src/Gargantext/Database/Action/Search.hs

```diff
@@ -9,11 +9,12 @@ Portability : POSIX
 -}

 {-# LANGUAGE Arrows #-}
+{-# LANGUAGE LambdaCase #-}

 module Gargantext.Database.Action.Search
   where

 import Control.Arrow (returnA)
-import Control.Lens ((^.))
+import Control.Lens ((^.), view)
 import qualified Data.List as List
 import qualified Data.Map.Strict as Map
 import Data.Maybe
...
@@ -157,28 +158,26 @@ queryInCorpus :: HasDBid NodeType
               -> Text
               -> O.Select FacetDocRead
 queryInCorpus cId t q = proc () -> do
-  (c, nc) <- joinInCorpus -< ()
-  restrict -< (nc^.nc_node_id) .== (toNullable $ pgNodeId cId)
+  c <- queryContextSearchTable -< ()
+  nc <- optionalRestrict queryNodeContextTable -<
+    \nc' -> (nc'^.nc_context_id) .== _cs_id c
+  restrict -< (view nc_node_id <$> nc) .=== justFields (pgNodeId cId)
   restrict -< if t
-    then (nc^.nc_category) .== (toNullable $ sqlInt4 0)
-    else (nc^.nc_category) .>= (toNullable $ sqlInt4 1)
-  restrict -< (c^.cs_search) @@ (sqlTSQuery (unpack q))
-  restrict -< (c^.cs_typename) .== (sqlInt4 $ toDBid NodeDocument)
+    then (view nc_category <$> nc) .=== justFields (sqlInt4 0)
+    else matchMaybe (view nc_category <$> nc) $ \case
+           Nothing -> toFields False
+           Just c' -> c' .>= sqlInt4 1
+  restrict -< (c^.cs_search) @@ sqlTSQuery (unpack q)
+  restrict -< (c^.cs_typename) .== sqlInt4 (toDBid NodeDocument)
   returnA -< FacetDoc { facetDoc_id         = c^.cs_id
                       , facetDoc_created    = c^.cs_date
                       , facetDoc_title      = c^.cs_name
                       , facetDoc_hyperdata  = c^.cs_hyperdata
-                      , facetDoc_category   = nc^.nc_category
-                      , facetDoc_ngramCount = nc^.nc_score
-                      , facetDoc_score      = nc^.nc_score
+                      , facetDoc_category   = maybeFieldsToNullable (view nc_category <$> nc)
+                      , facetDoc_ngramCount = maybeFieldsToNullable (view nc_score <$> nc)
+                      , facetDoc_score      = maybeFieldsToNullable (view nc_score <$> nc)
                       }

-joinInCorpus :: O.Select (ContextSearchRead, NodeContextReadNull)
-joinInCorpus = leftJoin queryContextSearchTable queryNodeContextTable cond
-  where
-    cond :: (ContextSearchRead, NodeContextRead) -> Column SqlBool
-    cond (c, nc) = nc^.nc_context_id .== _cs_id c

 ------------------------------------------------------------------------
 searchInCorpusWithContacts :: HasDBid NodeType
...
@@ -201,7 +200,7 @@ selectGroup :: HasDBid NodeType
             => CorpusId
             -> AnnuaireId
             -> Text
-            -> Select FacetPairedReadNull
+            -> Select FacetPairedRead
 selectGroup cId aId q = proc () -> do
   (a, b, c, d) <- aggregate (p4 (groupBy, groupBy, groupBy, O.sum))
                             (selectContactViaDoc cId aId q) -< ()
...
@@ -214,25 +213,46 @@ selectContactViaDoc
   -> AnnuaireId
   -> Text
   -> SelectArr ()
-               ( Column (Nullable SqlInt4)
-               , Column (Nullable SqlTimestamptz)
-               , Column (Nullable SqlJsonb)
-               , Column (Nullable SqlInt4)
+               ( Field SqlInt4
+               , Field SqlTimestamptz
+               , Field SqlJsonb
+               , Field SqlInt4
                )
 selectContactViaDoc cId aId query = proc () -> do
-  (doc, (corpus, (_nodeContext_nodeContext, (annuaire, contact)))) <- queryContactViaDoc -< ()
-  restrict -< (doc^.cs_search) @@ (sqlTSQuery $ unpack query)
-  restrict -< (doc^.cs_typename) .== (sqlInt4 $ toDBid NodeDocument)
-  restrict -< (corpus^.nc_node_id) .== (toNullable $ pgNodeId cId)
-  restrict -< (annuaire^.nc_node_id) .== (toNullable $ pgNodeId aId)
-  restrict -< (contact^.context_typename) .== (toNullable $ sqlInt4 $ toDBid NodeContact)
-  returnA -< ( contact^.context_id
-             , contact^.context_date
-             , contact^.context_hyperdata
-             , toNullable $ sqlInt4 1
-             )
+  --(doc, (corpus, (_nodeContext_nodeContext, (annuaire, contact)))) <- queryContactViaDoc -< ()
+  (contact, annuaire, _, corpus, doc) <- queryContactViaDoc -< ()
+  restrict -< matchMaybe (view cs_search <$> doc) $ \case
+                Nothing -> toFields False
+                Just s  -> s @@ sqlTSQuery (unpack query)
+  restrict -< (view cs_typename <$> doc) .=== justFields (sqlInt4 (toDBid NodeDocument))
+  restrict -< (view nc_node_id <$> corpus) .=== justFields (pgNodeId cId)
+  restrict -< (view nc_node_id <$> annuaire) .=== justFields (pgNodeId aId)
+  restrict -< (contact^.context_typename) .== sqlInt4 (toDBid NodeContact)
+  returnA -< ( contact^.context_id
+             , contact^.context_date
+             , contact^.context_hyperdata
+             , sqlInt4 1
+             )

-queryContactViaDoc :: O.Select ( ContextSearchRead
+queryContactViaDoc :: O.Select ( ContextRead
+                               , MaybeFields NodeContextRead
+                               , MaybeFields NodeContext_NodeContextRead
+                               , MaybeFields NodeContextRead
+                               , MaybeFields ContextSearchRead
+                               )
+queryContactViaDoc = proc () -> do
+  contact <- queryContextTable -< ()
+  annuaire <- optionalRestrict queryNodeContextTable -<
+    \annuaire' -> (annuaire'^.nc_context_id) .== (contact^.context_id)
+  nodeContext_nodeContext <- optionalRestrict queryNodeContext_NodeContextTable -<
+    \ncnc' -> justFields (ncnc'^.ncnc_nodecontext2) .=== (view nc_id <$> annuaire)
+  corpus <- optionalRestrict queryNodeContextTable -<
+    \corpus' -> justFields (corpus'^.nc_id) .=== (view ncnc_nodecontext1 <$> nodeContext_nodeContext)
+  doc <- optionalRestrict queryContextSearchTable -<
+    \doc' -> justFields (doc'^.cs_id) .=== (view nc_context_id <$> corpus)
+  returnA -< (contact, annuaire, nodeContext_nodeContext, corpus, doc)
+
+queryContactViaDoc' :: O.Select ( ContextSearchRead
                                 , ( NodeContextReadNull
                                   , ( NodeContext_NodeContextReadNull
                                     , ( NodeContextReadNull
...
@@ -241,7 +261,7 @@ queryContactViaDoc :: O.Select ( ContextSearchRead
                                     )
                                   )
                                 )
-queryContactViaDoc =
+queryContactViaDoc' =
   leftJoin5 queryContextTable
             queryNodeContextTable
...
```
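In `queryInCorpus`, the category restriction now goes through `matchMaybe`: a document with no matching `node_context` row simply does not match, instead of being compared against NULL. A plain-Haskell sketch of that semantics (the boolean flag is kept abstract, as in the query: True selects category 0, False selects category >= 1):

```haskell
-- Sketch of the new category test in queryInCorpus, in plain Haskell: an
-- absent node_context row (Nothing) no longer matches at all.
categoryMatches :: Bool -> Maybe Int -> Bool
categoryMatches True  mcat = mcat == Just 0
categoryMatches False mcat = maybe False (>= 1) mcat

main :: IO ()
main = mapM_ print
  [ categoryMatches False (Just 2)   -- True
  , categoryMatches False Nothing    -- False
  , categoryMatches True  (Just 0)   -- True
  ]
```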
src/Gargantext/Database/Action/Share.hs

```diff
 {-|
 Module      : Gargantext.Database.Action.Share
 Description :
 Copyright   : (c) CNRS, 2017-Present
 License     : AGPL + CECILL v3
 Maintainer  : team@gargantext.org
...
@@ -17,6 +17,7 @@ module Gargantext.Database.Action.Share
 import Control.Arrow (returnA)
 import Control.Lens (view, (^.))
+import Data.Maybe (catMaybes)
 import Data.Text (Text)
 import Gargantext.Core.Types.Individu (User(..))
 import Gargantext.Database
...
@@ -24,7 +25,7 @@ import Gargantext.Database.Action.User (getUserId)
 import Gargantext.Database.Admin.Config (hasNodeType, isInNodeTypes)
 import Gargantext.Database.Admin.Types.Hyperdata (HyperdataAny(..))
 import Gargantext.Database.Admin.Types.Node
-import Gargantext.Database.Query.Join (leftJoin3')
+-- import Gargantext.Database.Query.Join (leftJoin3')
 import Gargantext.Database.Query.Table.Node (getNode, getNodesWith)
 import Gargantext.Database.Query.Table.Node.Error (HasNodeError, errorWith)
 import Gargantext.Database.Query.Table.NodeNode (deleteNodeNode, queryNodeNodeTable)
...
@@ -32,6 +33,7 @@ import Gargantext.Database.Query.Table.User
 import Gargantext.Database.Query.Tree.Root (getRootId)
 import Gargantext.Database.Schema.Node
 import Gargantext.Prelude
+import Gargantext.Utils.Tuple (uncurryMaybe)
 import Opaleye hiding (not)
 import qualified Opaleye as O
...
@@ -60,28 +62,43 @@ type TeamNodeId = NodeId

 -- used for the membership
 membersOf :: HasNodeError err
           => TeamNodeId -> Cmd err [(Text, SharedFolderId)]
-membersOf nId = runOpaQuery (membersOfQuery nId)
+membersOf nId = do
+  res <- runOpaQuery $ membersOfQuery nId
+  pure $ catMaybes (uncurryMaybe <$> res)

 membersOfQuery :: TeamNodeId
-               -> SelectArr () (Column (Nullable SqlText), Column (Nullable SqlInt4))
+               -> SelectArr () (MaybeFields (Field SqlText), MaybeFields (Field SqlInt4))
 membersOfQuery (NodeId teamId) = proc () -> do
-  (nn, (n, u)) <- nodeNode_node_User -< ()
-  restrict -< nn^.nn_node2_id .== sqlInt4 teamId
-  returnA -< (user_username u, n^.node_id)
-
-nodeNode_node_User :: O.Select (NodeNodeRead, (NodeReadNull, UserReadNull))
-nodeNode_node_User = leftJoin3' queryNodeNodeTable
-                                queryNodeTable
-                                queryUserTable
-                                cond12
-                                cond23
-  where
-    cond12 :: (NodeNodeRead, (NodeRead, UserReadNull)) -> Column SqlBool
-    cond12 (nn, (n, _u)) = (nn^.nn_node1_id .== n^.node_id)
-    cond23 :: (NodeRead, UserRead) -> Column SqlBool
-    cond23 (n, u) = (n^.node_user_id .== user_id u)
+  (nn, n, u) <- nodeNode_node_User -< ()
+  restrict -< (nn^.nn_node2_id) .== sqlInt4 teamId
+  returnA -< (user_username <$> u, view node_id <$> n)
+
+nodeNode_node_User :: O.Select (NodeNodeRead, MaybeFields NodeRead, MaybeFields UserRead)
+nodeNode_node_User = proc () -> do
+  nn <- queryNodeNodeTable -< ()
+  n <- optionalRestrict queryNodeTable -<
+    \n' -> (n'^.node_id) .== (nn^.nn_node1_id)
+  u <- optionalRestrict queryUserTable -<
+    \u' -> (view node_user_id <$> n) .=== justFields (user_id u')
+  returnA -< (nn, n, u)
+
+-- nodeNode_node_User' :: O.Select (NodeNodeRead, (NodeReadNull, UserReadNull))
+-- nodeNode_node_User' = leftJoin3' queryNodeNodeTable
+--                                  queryNodeTable
+--                                  queryUserTable
+--                                  cond12
+--                                  cond23
+--   where
+--     cond12 :: (NodeNodeRead, (NodeRead, UserReadNull)) -> Column SqlBool
+--     cond12 (nn, (n, _u)) = (nn^.nn_node1_id .== n^.node_id)
+--     cond23 :: (NodeRead, UserRead) -> Column SqlBool
+--     cond23 (n, u) = (n^.node_user_id .== user_id u)
...
@@ -144,4 +161,3 @@ unPublish :: HasNodeError err
           => ParentId -> NodeId -> Cmd err Int
 unPublish p n = deleteNodeNode p n
```
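`membersOf` now post-processes the query result with `catMaybes (uncurryMaybe <$> res)`, dropping rows where the joined node or user is absent. A plausible sketch of that helper; the real definition lives in the new `Gargantext.Utils.Tuple` module, so treat this signature and body as an assumption:

```haskell
-- Plausible sketch of uncurryMaybe (the real definition is in the new
-- Gargantext.Utils.Tuple module; this is an assumption, not that code).
import Data.Maybe (catMaybes)

uncurryMaybe :: (Maybe a, Maybe b) -> Maybe (a, b)
uncurryMaybe (Just a, Just b) = Just (a, b)
uncurryMaybe _                = Nothing

main :: IO ()
main = print (catMaybes (map uncurryMaybe
                [ (Just "alice", Just (1 :: Int))
                , (Nothing,      Just 2)
                , (Just "bob",   Just 3) ]))
-- [("alice",1),("bob",3)]
```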
src/Gargantext/Database/Prelude.hs

```diff
@@ -9,7 +9,9 @@ Portability : POSIX
 -}

 {-# LANGUAGE Arrows #-}
 {-# LANGUAGE ConstraintKinds, ScopedTypeVariables #-}
+{-# LANGUAGE LambdaCase #-}

 module Gargantext.Database.Prelude
   where
...
@@ -33,18 +35,17 @@ import Database.PostgreSQL.Simple.Internal (Field)
 import Database.PostgreSQL.Simple.Types (Query(..))
 import Gargantext.Core.Mail.Types (HasMail)
 import Gargantext.Prelude
-import Gargantext.Prelude.Config (readIniFile', val)
-import Opaleye (Unpackspec, showSql, FromFields, Select, runSelect, SqlJsonb, DefaultFromField)
+import Gargantext.Prelude.Config (GargConfig(), readIniFile', val)
+import Opaleye (Unpackspec, showSql, FromFields, Select, runSelect, SqlJsonb, DefaultFromField, toFields, matchMaybe, MaybeFields)
 import Opaleye.Aggregate (countRows)
-import System.IO (FilePath)
-import System.IO (stderr)
+import qualified Opaleye.Internal.Constant
+import qualified Opaleye.Internal.Operators
+import System.IO (FilePath, stderr)
 import Text.Read (readMaybe)
 import qualified Data.ByteString as DB
 import qualified Data.List as DL
 import qualified Database.PostgreSQL.Simple as PGS
-import Gargantext.Prelude.Config (GargConfig())

 -------------------------------------------------------
 class HasConnectionPool env where
   connPool :: Getter env (Pool Connection)
...
@@ -215,3 +216,10 @@ dbCheck = do
   case r of
     [] -> return False
     _  -> return True
+
+restrictMaybe :: ( Default Opaleye.Internal.Operators.IfPP b b
+                 , (Default Opaleye.Internal.Constant.ToFields Bool b))
+              => MaybeFields a -> (a -> b) -> b
+restrictMaybe v cond = matchMaybe v $ \case
+  Nothing -> toFields True
+  Just v' -> cond v'
```
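The new `restrictMaybe` helper defaults to True when the optional row is absent, so a missing join partner does not filter the outer row out. A plain-Haskell analogue of that behaviour, outside Opaleye:

```haskell
-- Plain-Haskell analogue of restrictMaybe: an absent row passes the
-- restriction (condition defaults to True), a present row is tested.
restrictMaybe' :: Maybe a -> (a -> Bool) -> Bool
restrictMaybe' v cond = maybe True cond v

main :: IO ()
main = print ( restrictMaybe' (Nothing :: Maybe Int) (> 3)
             , restrictMaybe' (Just 5) (> 3)
             , restrictMaybe' (Just 1) (> 3) )
-- (True,True,False)
```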
src/Gargantext/Database/Query/Facet.hs
...
...
@@ -48,7 +48,6 @@ import qualified Opaleye.Internal.Unpackspec()
import Gargantext.Core
import Gargantext.Core.Types
import Gargantext.Database.Query.Filter
import Gargantext.Database.Query.Join (leftJoin5)
import Gargantext.Database.Query.Table.Ngrams
import Gargantext.Database.Query.Table.Context
import Gargantext.Database.Query.Facet.Types
...
...
@@ -82,35 +81,53 @@ viewAuthorsDoc :: HasDBid NodeType
               -> NodeType -> Select FacetDocRead
viewAuthorsDoc cId _ nt = proc () -> do
  (doc,(_,(_,(_,contact')))) <- queryAuthorsDoc -< ()
  --(doc,(_,(_,(_,contact')))) <- queryAuthorsDoc -< ()
  (doc, _, _, _, contact') <- queryAuthorsDoc -< ()

  restrict -< _node_id contact'  .== (toNullable $ pgNodeId cId)
  restrict -< _node_typename doc .== (sqlInt4 $ toDBid nt)
  restrict -< fromMaybeFields (sqlInt4 $ -1) (_node_id <$> contact') .=== pgNodeId cId
  restrict -< _node_typename doc .== sqlInt4 (toDBid nt)

  returnA -< FacetDoc { facetDoc_id         = _node_id doc
                      , facetDoc_created    = _node_date doc
                      , facetDoc_title      = _node_name doc
                      , facetDoc_hyperdata  = _node_hyperdata doc
                      , facetDoc_category   = toNullable $ sqlInt4 1
                      , facetDoc_ngramCount = toNullable $ sqlDouble 1
                      , facetDoc_ngramCount = toNullable $ sqlDouble 1.0
                      , facetDoc_score      = toNullable $ sqlDouble 1 }
queryAuthorsDoc :: Select (NodeRead, (ContextNodeNgramsReadNull, (NgramsReadNull, (ContextNodeNgramsReadNull, NodeReadNull))))
queryAuthorsDoc = leftJoin5 queryNodeTable
                            queryContextNodeNgramsTable
                            queryNgramsTable
                            queryContextNodeNgramsTable
                            queryNodeTable
                            cond12 cond23 cond34 cond45
  where
    cond12 :: (ContextNodeNgramsRead, NodeRead) -> Column SqlBool
    cond12 (nodeNgram, doc) = _node_id doc .== _cnng_context_id nodeNgram

    cond23 :: (NgramsRead, (ContextNodeNgramsRead, NodeReadNull)) -> Column SqlBool
    cond23 (ngrams', (nodeNgram, _)) = ngrams' ^. ngrams_id .== _cnng_ngrams_id nodeNgram

    cond34 :: (ContextNodeNgramsRead, (NgramsRead, (ContextNodeNgramsReadNull, NodeReadNull))) -> Column SqlBool
    cond34 (nodeNgram2, (ngrams', (_,_))) = ngrams' ^. ngrams_id .== _cnng_ngrams_id nodeNgram2

    cond45 :: (NodeRead, (ContextNodeNgramsRead, (NgramsReadNull, (ContextNodeNgramsReadNull, NodeReadNull)))) -> Column SqlBool
    cond45 (contact', (nodeNgram2', (_, (_,_)))) = _node_id contact' .== _cnng_context_id nodeNgram2'
--queryAuthorsDoc :: Select (NodeRead, (ContextNodeNgramsReadNull, (NgramsReadNull, (ContextNodeNgramsRead, NodeReadNull))))
--queryAuthorsDoc = leftJoin5 queryNodeTable queryContextNodeNgramsTable queryNgramsTable queryContextNodeNgramsTable queryNodeTable cond12 cond23 cond34 cond45
queryAuthorsDoc :: Select ( NodeRead
                          , MaybeFields ContextNodeNgramsRead
                          , MaybeFields NgramsRead
                          , MaybeFields ContextNodeNgramsRead
                          , MaybeFields NodeRead )
queryAuthorsDoc = proc () -> do
  n <- queryNodeTable -< ()
  cnn <- optionalRestrict queryContextNodeNgramsTable -<
           \cnn' -> _node_id n .== _cnng_context_id cnn'
  ng <- optionalRestrict queryNgramsTable -<
           \ng' -> justFields (ng' ^. ngrams_id) .=== (_cnng_ngrams_id <$> cnn)
  cnn2 <- optionalRestrict queryContextNodeNgramsTable -<
           \cnn2' -> (_ngrams_id <$> ng) .=== justFields (_cnng_ngrams_id cnn2')
  contact <- optionalRestrict queryNodeTable -<
           \contact' -> justFields (_node_id contact') .=== (_cnng_context_id <$> cnn2)
  returnA -< (n, cnn, ng, cnn2, contact)
-- where
-- cond12 :: (ContextNodeNgramsRead, NodeRead) -> Field SqlBool
-- cond12 (nodeNgram, doc) = _node_id doc
-- .== _cnng_context_id nodeNgram
-- cond23 :: (NgramsRead, (ContextNodeNgramsRead, NodeReadNull)) -> Field SqlBool
-- cond23 (ngrams', (nodeNgram, _)) = ngrams'^.ngrams_id
-- .== _cnng_ngrams_id nodeNgram
-- cond34 :: (ContextNodeNgramsRead, (NgramsRead, (ContextNodeNgramsReadNull, NodeReadNull))) -> Field SqlBool
-- cond34 (nodeNgram2, (ngrams', (_,_)))= ngrams'^.ngrams_id .== _cnng_ngrams_id nodeNgram2
-- cond45 :: (NodeRead, (ContextNodeNgramsRead, (NgramsReadNull, (ContextNodeNgramsReadNull, NodeReadNull)))) -> Field SqlBool
-- cond45 (contact', (nodeNgram2', (_, (_,_)))) = _node_id contact' .== _cnng_context_id nodeNgram2'
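-- Minimal sketch of the comparison idiom used in the rewritten queryAuthorsDoc
-- above (names are the ones from this module; the helper itself is illustrative
-- and not part of the commit): a required field is lifted with 'justFields' so
-- that '.===' can compare it against an optional 'MaybeFields' value.
sameNgramsId :: NgramsRead -> MaybeFields ContextNodeNgramsRead -> Field SqlBool
sameNgramsId ng cnn =
  -- equal only when the optional row is present and carries the same ngrams_id
  justFields (ng ^. ngrams_id) .=== (_cnng_ngrams_id <$> cnn)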
------------------------------------------------------------------------
...
...
@@ -144,6 +161,7 @@ viewDocuments :: CorpusId
              -> Maybe Text
              -> Select FacetDocRead
viewDocuments cId t ntId mQuery mYear = viewDocumentsQuery cId t ntId mQuery mYear >>> proc (c, nc) -> do
  -- ngramCountAgg <- aggregate sumInt4 -< cnng
  returnA  -< FacetDoc { facetDoc_id      = _cs_id c
                       , facetDoc_created = _cs_date c
                       , facetDoc_title   = _cs_name c
...
...
@@ -153,29 +171,38 @@ viewDocuments cId t ntId mQuery mYear = viewDocumentsQuery cId t ntId mQuery mYe
                       , facetDoc_score   = toNullable $ nc ^. nc_score }
-- TODO Join with context_node_ngrams at context_id/node_id and sum by
-- doc_count.
viewDocumentsQuery :: CorpusId
                   -> IsTrash
                   -> NodeTypeId
                   -> Maybe Text
                   -> Maybe Text
                   -> Select (ContextSearchRead, NodeContextRead)
-- -> Select (ContextSearchRead, NodeContextRead, MaybeFields ContextNodeNgramsRead)
viewDocumentsQuery cId t ntId mQuery mYear = proc () -> do
  c  <- queryContextSearchTable -< ()
  nc <- queryNodeContextTable -< ()
  restrict -< c ^. cs_id .== nc ^. nc_context_id
  restrict -< nc ^. nc_node_id .== (pgNodeId cId)
  restrict -< c ^. cs_typename .== (sqlInt4 ntId)
  restrict -< if t then nc ^. nc_category .== (sqlInt4 0)
                   else nc ^. nc_category .>= (sqlInt4 1)
-- let joinCond (nc, cnn) = do
-- restrict -< (nc ^. context_id) .== (cnn ^. context_id)
-- restrict -< (nc ^. node_id) .== (cnn ^. node_id) -- :: (NodeContextRead, ContextNodeNgramsRead) -> Field SqlBool
  nc <- queryNodeContextTable -< ()
  restrict -< (c ^. cs_id) .== (nc ^. nc_context_id)
  restrict -< nc ^. nc_node_id .== pgNodeId cId
  restrict -< c ^. cs_typename .== sqlInt4 ntId
-- cnng <- optionalRestrict queryContextNodeNgramsTable -<
-- (\cnng' -> (nc ^. nc_context_id) .== (cnng' ^. cnng_context_id) .&&
-- (nc ^. nc_node_id) .== (cnng' ^. cnng_node_id))
  restrict -< if t then nc ^. nc_category .== sqlInt4 0
                   else nc ^. nc_category .>= sqlInt4 1

  let query = (fromMaybe "" mQuery)
      year  = (fromMaybe "" mYear)
      iLikeQuery = T.intercalate "" ["%", query, "%"]
      abstractLHS h = fromNullable (sqlStrictText "")
                    $ toNullable h .->> (sqlStrictText "abstract")
                    $ toNullable h .->> sqlStrictText "abstract"
      yearLHS h = fromNullable (sqlStrictText "")
                $ toNullable h .->> (sqlStrictText "publication_year")
                $ toNullable h .->> sqlStrictText "publication_year"

  restrict -< if query == "" then sqlBool True
...
...
@@ -183,42 +210,43 @@ viewDocumentsQuery cId t ntId mQuery mYear = proc () -> do
              .|| ((abstractLHS (c ^. cs_hyperdata)) `ilike` (sqlStrictText iLikeQuery))
  restrict -< if year == "" then sqlBool True
                else (yearLHS (c ^. cs_hyperdata)) .== (sqlStrictText year)
                else yearLHS (c ^. cs_hyperdata) .== sqlStrictText year
  returnA -< (c, nc)
-- returnA -< (c, nc, cnng)
------------------------------------------------------------------------
filterWith :: (SqlOrd date, SqlOrd title, SqlOrd category, SqlOrd score, hyperdata ~ Column SqlJsonb) =>
filterWith :: (SqlOrd date, SqlOrd title, SqlOrd category, SqlOrd score, hyperdata ~ SqlJsonb) =>
                Maybe Gargantext.Core.Types.Offset
             -> Maybe Gargantext.Core.Types.Limit
             -> Maybe OrderBy
             -> Select (Facet id (Column date) (Column title) hyperdata (Column category) ngramCount (Column score))
             -> Select (Facet id (Column date) (Column title) hyperdata (Column category) ngramCount (Column score))
             -> Select (Facet id (Field date) (Field title) (Field hyperdata) (FieldNullable category) ngramCount (FieldNullable score))
             -> Select (Facet id (Field date) (Field title) (Field hyperdata) (FieldNullable category) ngramCount (FieldNullable score))
filterWith o l order q = limit' l $ offset' o $ orderBy (orderWith order) q
orderWith :: (SqlOrd b1, SqlOrd b2, SqlOrd b3, SqlOrd b4)
          => Maybe OrderBy
          -> Order (Facet id (Column b1) (Column b2) (Column SqlJsonb) (Column b3) ngramCount (Column b4))
          -> Order (Facet id (Field b1) (Field b2) (Field SqlJsonb) (FieldNullable b3) ngramCount (FieldNullable b4))
orderWith (Just DateAsc)   = asc  facetDoc_created
orderWith (Just DateDesc)  = desc facetDoc_created

orderWith (Just TitleAsc)  = asc  facetDoc_title
orderWith (Just TitleDesc) = desc facetDoc_title

orderWith (Just ScoreAsc)  = asc facetDoc_score
orderWith (Just ScoreAsc)  = ascNullsLast  facetDoc_score
orderWith (Just ScoreDesc) = descNullsLast facetDoc_score

orderWith (Just SourceAsc)  = asc  facetDoc_source
orderWith (Just SourceDesc) = desc facetDoc_source
orderWith (Just SourceAsc)  = ascNullsLast  facetDoc_source
orderWith (Just SourceDesc) = descNullsLast facetDoc_source

orderWith (Just TagAsc)  = asc  facetDoc_category
orderWith (Just TagDesc) = desc facetDoc_category
orderWith (Just TagAsc)  = ascNullsLast  facetDoc_category
orderWith (Just TagDesc) = descNullsLast facetDoc_category

orderWith _ = asc facetDoc_created
facetDoc_source :: SqlIsJson a
                => Facet id created title (Column a) favorite ngramCount score
                -> Column (Nullable SqlText)
facetDoc_source x = toNullable (facetDoc_hyperdata x) .->> sqlString "source"
                => Facet id created title (Field a) favorite ngramCount score
                -> FieldNullable SqlText
facetDoc_source x = (toNullable $ facetDoc_hyperdata x) .->> sqlString "source"
src/Gargantext/Database/Query/Facet/Types.hs
...
...
@@ -97,28 +97,28 @@ instance ( Arbitrary id
         ) => Arbitrary (FacetPaired id date hyperdata score) where
  arbitrary = FacetPaired <$> arbitrary <*> arbitrary <*> arbitrary <*> arbitrary

type FacetPairedRead = FacetPaired (Column SqlInt4) (Column SqlTimestamptz) (Column SqlJsonb) (Column SqlInt4)
type FacetPairedReadNull = FacetPaired (Column (Nullable SqlInt4))
                                       (Column (Nullable SqlTimestamptz))
                                       (Column (Nullable SqlJsonb))
                                       (Column (Nullable SqlInt4))
type FacetPairedReadNullAgg = FacetPaired (Aggregator (Column (Nullable SqlInt4)) (Column (Nullable SqlInt4)))
type FacetPairedRead = FacetPaired (Field SqlInt4) (Field SqlTimestamptz) (Field SqlJsonb) (Field SqlInt4)
type FacetPairedReadNull = FacetPaired (FieldNullable SqlInt4)
                                       (FieldNullable SqlTimestamptz)
                                       (FieldNullable SqlJsonb)
                                       (FieldNullable SqlInt4)
type FacetPairedReadNullAgg = FacetPaired (Aggregator (FieldNullable SqlInt4) (FieldNullable SqlInt4))
                                          (Aggregator (Column (Nullable SqlTimestamptz)) (Column (Nullable SqlTimestamptz)))
                                          (Aggregator (FieldNullable SqlTimestamptz) (FieldNullable SqlTimestamptz))
                                          (Aggregator (Column (Nullable SqlJsonb)) (Column (Nullable SqlJsonb)))
                                          (Aggregator (FieldNullable SqlJsonb) (FieldNullable SqlJsonb))
                                          (Aggregator (Column (Nullable SqlInt4)) (Column (Nullable SqlInt4)))
                                          (Aggregator (FieldNullable SqlInt4) (FieldNullable SqlInt4))
...
...
@@ -148,13 +148,13 @@ instance Arbitrary FacetDoc where
$(makeAdaptorAndInstance "pFacetDoc" ''Facet)
-- $(makeLensesWith abbreviatedFields ''Facet)

type FacetDocRead = Facet (Column SqlInt4)
                          (Column SqlTimestamptz)
                          (Column SqlText)
                          (Column SqlJsonb)
                          (Column (Nullable SqlInt4))   -- Category
                          (Column (Nullable SqlFloat8)) -- Ngrams Count
                          (Column (Nullable SqlFloat8)) -- Score
type FacetDocRead = Facet (Field SqlInt4)
                          (Field SqlTimestamptz)
                          (Field SqlText)
                          (Field SqlJsonb)
                          (FieldNullable SqlInt4)   -- Category
                          (FieldNullable SqlFloat8) -- Ngrams Count
                          (FieldNullable SqlFloat8) -- Score
-----------------------------------------------------------------------
-----------------------------------------------------------------------
...
...
src/Gargantext/Database/Query/Join.hs
...
...
@@ -111,17 +111,17 @@ leftJoin4 q1 q2 q3 q4
                     ) cond34

leftJoin5 :: ( Default Unpackspec b2 b2, Default Unpackspec b3 b3, Default Unpackspec b4 b4
             , Default Unpackspec b5 b5, Default Unpackspec b6 b6, Default Unpackspec b7 b7
             , Default Unpackspec fieldsL fieldsL, Default Unpackspec b8 b8, Default Unpackspec b9 b9
             , Default Unpackspec b10 b10, Default Unpackspec fieldsR fieldsR
             , Default NullMaker b7 b6, Default NullMaker b6 b11, Default NullMaker b8 b12
             , Default NullMaker b3 b13, Default NullMaker b2 b14, Default NullMaker b9 b3
             , Default NullMaker b10 b2, Default NullMaker b5 b9, Default NullMaker b4 b10
             , Default NullMaker fieldsR b4 ) =>
leftJoin5 :: ( Default Unpackspec b2 b2, Default Unpackspec b3 b3, Default Unpackspec b4 b4
             , Default Unpackspec b5 b5, Default Unpackspec b6 b6, Default Unpackspec b7 b7
             , Default Unpackspec fieldsL fieldsL, Default Unpackspec b8 b8, Default Unpackspec b9 b9
             , Default Unpackspec b10 b10, Default Unpackspec fieldsR fieldsR
             , Default NullMaker b7 b6, Default NullMaker b6 b11, Default NullMaker b8 b12
             , Default NullMaker b3 b13, Default NullMaker b2 b14, Default NullMaker b9 b3
             , Default NullMaker b10 b2, Default NullMaker b5 b9, Default NullMaker b4 b10
             , Default NullMaker fieldsR b4 ) =>
             Select fieldsR -> Select b5 -> Select b7
...
...
src/Gargantext/Database/Query/Table/Node/Select.hs
...
...
@@ -10,6 +10,7 @@ Portability : POSIX
{-# LANGUAGE Arrows #-}
{-# LANGUAGE LambdaCase #-}
module Gargantext.Database.Query.Table.Node.Select where
...
...
@@ -27,16 +28,19 @@ import Gargantext.Database.Schema.User
import Gargantext.Database.Query.Table.User
selectNodesWithUsername :: NodeType -> Username -> Cmd err [NodeId]
selectNodesWithUsername nt u = runOpaQuery (q u)
  where
    q u' = proc () -> do
      (n, usrs) <- join' -< ()
      restrict -< user_username usrs .== (toNullable $ sqlStrictText u')
      restrict -< _node_typename n .== (sqlInt4 $ toDBid nt)
      returnA -< _node_id n

    join' :: Select (NodeRead, UserReadNull)
    join' = leftJoin queryNodeTable queryUserTable on1
      where
        on1 (n, us) = _node_user_id n .== user_id us

selectNodesWithUsername nt u = runOpaQuery $ proc () -> do
  n <- queryNodeTable -< ()
  usrs <- optionalRestrict queryUserTable -<
            (\us' -> _node_user_id n .== user_id us')
  restrict -< matchMaybe usrs $ \case
    Nothing -> toFields True
    Just us -> user_username us .== sqlStrictText u
  restrict -< _node_typename n .== sqlInt4 (toDBid nt)
  returnA -< _node_id n
-- join' :: Select (NodeRead, UserReadNull)
-- --join' = leftJoin queryNodeTable queryUserTable on1
-- join' = optionalRestrict queryUserTable -<
-- (\(n, us) -> _node_user_id n .== user_id ud)
-- -- where
-- -- on1 (n,us) = _node_user_id n .== user_id us
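-- For reference, a minimal sketch (not from the diff) of the 'matchMaybe'
-- pattern used in the rewrite above; it assumes this module's LambdaCase
-- pragma and the User table accessors. The row is kept when no user row was
-- joined, otherwise it is filtered on the given username.
keepUser :: Username -> MaybeFields UserRead -> Field SqlBool
keepUser uname mu = matchMaybe mu $ \case
  Nothing -> toFields True                             -- no joined row: keep
  Just us -> user_username us .== sqlStrictText uname  -- filter on the username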
src/Gargantext/Database/Query/Table/NodeContext.hs
...
...
@@ -15,6 +15,7 @@ commentary with @some markup@.
{-# LANGUAGE Arrows #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE TemplateHaskell #-}
...
...
@@ -75,7 +76,7 @@ _nodesContexts = runOpaQuery queryNodeContextTable
getNodeContexts :: NodeId -> Cmd err [NodeContext]
getNodeContexts n = runOpaQuery (selectNodeContexts $ pgNodeId n)
  where
    selectNodeContexts :: Column SqlInt4 -> Select NodeContextRead
    selectNodeContexts :: Field SqlInt4 -> Select NodeContextRead
    selectNodeContexts n' = proc () -> do
      ns <- queryNodeContextTable -< ()
      restrict -< _nc_node_id ns .== n'
...
...
@@ -89,7 +90,7 @@ getNodeContext c n = do
    Nothing -> nodeError (DoesNotExist c)
    Just  r -> pure r
  where
    selectNodeContext :: Column SqlInt4 -> Column SqlInt4 -> Select NodeContextRead
    selectNodeContext :: Field SqlInt4 -> Field SqlInt4 -> Select NodeContextRead
    selectNodeContext c' n' = proc () -> do
      ns <- queryNodeContextTable -< ()
      restrict -< _nc_context_id ns .== c'
...
...
@@ -211,7 +212,7 @@ nodeContextsCategory :: [(CorpusId, DocId, Int)] -> Cmd err [Int]
nodeContextsCategory inputData = map (\(PGS.Only a) -> a)
                             <$> runPGSQuery catSelect (PGS.Only $ Values fields inputData)
  where
    fields = map (\t -> QualifiedIdentifier Nothing t) ["int4","int4","int4"]
    fields = map (QualifiedIdentifier Nothing) ["int4","int4","int4"]
    catSelect :: PGS.Query
    catSelect = [sql|
UPDATE nodes_contexts as nn0
SET category = nn1.category
...
...
@@ -227,7 +228,7 @@ nodeContextsScore :: [(CorpusId, DocId, Int)] -> Cmd err [Int]
nodeContextsScore inputData = map (\(PGS.Only a) -> a)
                          <$> runPGSQuery catScore (PGS.Only $ Values fields inputData)
  where
    fields = map (\t -> QualifiedIdentifier Nothing t) ["int4","int4","int4"]
    fields = map (QualifiedIdentifier Nothing) ["int4","int4","int4"]
    catScore :: PGS.Query
    catScore = [sql|
UPDATE nodes_contexts as nn0
SET score = nn1.score
...
...
@@ -244,9 +245,9 @@ selectCountDocs cId = runCountOpaQuery (queryCountDocs cId)
  where
    queryCountDocs cId' = proc () -> do
      (c, nc) <- joinInCorpus -< ()
      restrict -< nc ^. nc_node_id  .== (toNullable $ pgNodeId cId')
      restrict -< nc ^. nc_category .>= (toNullable $ sqlInt4 1)
      restrict -< c ^. context_typename .== (sqlInt4 $ toDBid NodeDocument)
      restrict -< restrictMaybe nc $ \nc' -> (nc' ^. nc_node_id)  .== pgNodeId cId' .&&
                                             (nc' ^. nc_category) .>= sqlInt4 1
      restrict -< (c ^. context_typename) .== sqlInt4 (toDBid NodeDocument)
      returnA -< c
...
...
@@ -260,12 +261,12 @@ selectDocsDates cId = map (head' "selectDocsDates" . splitOn "-")
selectDocs :: HasDBid NodeType => CorpusId -> Cmd err [HyperdataDocument]
selectDocs cId = runOpaQuery (queryDocs cId)

queryDocs :: HasDBid NodeType => CorpusId -> O.Select (Column SqlJsonb)
queryDocs :: HasDBid NodeType => CorpusId -> O.Select (Field SqlJsonb)
queryDocs cId = proc () -> do
  (c, nn) <- joinInCorpus -< ()
  restrict -< nn ^. nc_node_id  .== (toNullable $ pgNodeId cId)
  restrict -< nn ^. nc_category .>= (toNullable $ sqlInt4 1)
  restrict -< c ^. context_typename .== (sqlInt4 $ toDBid NodeDocument)
  restrict -< restrictMaybe nn $ \nn' -> (nn' ^. nc_node_id)  .== pgNodeId cId .&&
                                         (nn' ^. nc_category) .>= sqlInt4 1
  restrict -< (c ^. context_typename) .== sqlInt4 (toDBid NodeDocument)
  returnA -< view (context_hyperdata) c
selectDocNodes :: HasDBid NodeType => CorpusId -> Cmd err [Context HyperdataDocument]
...
...
@@ -274,23 +275,29 @@ selectDocNodes cId = runOpaQuery (queryDocNodes cId)
queryDocNodes :: HasDBid NodeType => CorpusId -> O.Select ContextRead
queryDocNodes cId = proc () -> do
  (c, nc) <- joinInCorpus -< ()
  restrict -< nc ^. nc_node_id  .== (toNullable $ pgNodeId cId)
  restrict -< nc ^. nc_category .>= (toNullable $ sqlInt4 1)
  restrict -< c ^. context_typename .== (sqlInt4 $ toDBid NodeDocument)
-- restrict -< restrictMaybe nc $ \nc' -> (nc' ^. nc_node_id) .== pgNodeId cId .&&
-- (nc' ^. nc_category) .>= sqlInt4 1
  restrict -< matchMaybe nc $ \case
    Nothing  -> toFields True
    Just nc' -> (nc' ^. nc_node_id)  .== pgNodeId cId .&&
                (nc' ^. nc_category) .>= sqlInt4 1
  restrict -< (c ^. context_typename) .== sqlInt4 (toDBid NodeDocument)
  returnA -< c
joinInCorpus :: O.Select (ContextRead, NodeContextReadNull)
joinInCorpus = leftJoin queryContextTable queryNodeContextTable cond
  where
    cond :: (ContextRead, NodeContextRead) -> Column SqlBool
    cond (c, nc) = c ^. context_id .== nc ^. nc_context_id

joinInCorpus :: O.Select (ContextRead, MaybeFields NodeContextRead)
joinInCorpus = proc () -> do
  c <- queryContextTable -< ()
  nc <- optionalRestrict queryNodeContextTable -<
          (\nc' -> (c ^. context_id) .== (nc' ^. nc_context_id))
  returnA -< (c, nc)

joinOn1 :: O.Select (NodeRead, NodeContextReadNull)
joinOn1 = leftJoin queryNodeTable queryNodeContextTable cond
  where
    cond :: (NodeRead, NodeContextRead) -> Column SqlBool
    cond (n, nc) = nc ^. nc_node_id .== n ^. node_id

joinOn1 :: O.Select (NodeRead, MaybeFields NodeContextRead)
joinOn1 = proc () -> do
  n <- queryNodeTable -< ()
  nc <- optionalRestrict queryNodeContextTable -<
          (\nc' -> (nc' ^. nc_node_id) .== (n ^. node_id))
  returnA -< (n, nc)
------------------------------------------------------------------------
...
...
@@ -298,8 +305,8 @@ selectPublicContexts :: HasDBid NodeType => (Hyperdata a, DefaultFromField SqlJs
                     => Cmd err [(Node a, Maybe Int)]
selectPublicContexts = runOpaQuery (queryWithType NodeFolderPublic)

queryWithType :: HasDBid NodeType => NodeType -> O.Select (NodeRead, Column (Nullable SqlInt4))
queryWithType :: HasDBid NodeType => NodeType -> O.Select (NodeRead, MaybeFields (Field SqlInt4))
queryWithType nt = proc () -> do
  (n, nc) <- joinOn1 -< ()
  restrict -< n ^. node_typename .== (sqlInt4 $ toDBid nt)
  returnA  -< (n, nc ^. nc_context_id)
  restrict -< (n ^. node_typename) .== sqlInt4 (toDBid nt)
  returnA  -< (n, view nc_context_id <$> nc)
src/Gargantext/Database/Query/Table/NodeNode.hs
{-| Module : Gargantext.Database.Select.Table.NodeNode
Description :
Description :
Copyright : (c) CNRS, 2017-Present
License : AGPL + CECILL v3
Maintainer : team@gargantext.org
...
...
@@ -14,6 +14,7 @@ commentary with @some markup@.
{-# LANGUAGE Arrows #-}
{-# LANGUAGE FunctionalDependencies #-}
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE QuasiQuotes #-}
{-# LANGUAGE TemplateHaskell #-}
...
...
@@ -135,7 +136,7 @@ nodeNodesCategory :: [(CorpusId, DocId, Int)] -> Cmd err [Int]
nodeNodesCategory inputData = map (\(PGS.Only a) -> a)
                          <$> runPGSQuery catQuery (PGS.Only $ Values fields inputData)
  where
    fields = map (\t -> QualifiedIdentifier Nothing t) ["int4","int4","int4"]
    fields = map (QualifiedIdentifier Nothing) ["int4","int4","int4"]
    catQuery :: PGS.Query
    catQuery = [sql|
UPDATE nodes_nodes as nn0
SET category = nn1.category
...
...
@@ -160,7 +161,7 @@ nodeNodesScore :: [(CorpusId, DocId, Int)] -> Cmd err [Int]
nodeNodesScore inputData = map (\(PGS.Only a) -> a)
                       <$> runPGSQuery catScore (PGS.Only $ Values fields inputData)
  where
    fields = map (\t -> QualifiedIdentifier Nothing t) ["int4","int4","int4"]
    fields = map (QualifiedIdentifier Nothing) ["int4","int4","int4"]
    catScore :: PGS.Query
    catScore = [sql|
UPDATE nodes_nodes as nn0
SET score = nn1.score
...
...
@@ -176,9 +177,11 @@ _selectCountDocs cId = runCountOpaQuery (queryCountDocs cId)
  where
    queryCountDocs cId' = proc () -> do
      (n, nn) <- joinInCorpus -< ()
      restrict -< nn ^. nn_node1_id   .== (toNullable $ pgNodeId cId')
      restrict -< nn ^. nn_category   .>= (toNullable $ sqlInt4 1)
      restrict -< n  ^. node_typename .== (sqlInt4 $ toDBid NodeDocument)
      restrict -< matchMaybe nn $ \case
        Nothing  -> toFields True
        Just nn' -> (nn' ^. nn_node1_id) .== pgNodeId cId' .&&
                    (nn' ^. nn_category) .>= sqlInt4 1
      restrict -< n ^. node_typename .== sqlInt4 (toDBid NodeDocument)
      returnA -< n
...
...
@@ -197,10 +200,12 @@ selectDocs cId = runOpaQuery (queryDocs cId)
queryDocs :: HasDBid NodeType => CorpusId -> O.Select (Column SqlJsonb)
queryDocs cId = proc () -> do
  (n, nn) <- joinInCorpus -< ()
  restrict -< nn ^. nn_node1_id   .== (toNullable $ pgNodeId cId)
  restrict -< nn ^. nn_category   .>= (toNullable $ sqlInt4 1)
  restrict -< n  ^. node_typename .== (sqlInt4 $ toDBid NodeDocument)
  returnA -< view (node_hyperdata) n
  restrict -< matchMaybe nn $ \case
    Nothing  -> toFields True
    Just nn' -> (nn' ^. nn_node1_id) .== pgNodeId cId .&&
                (nn' ^. nn_category) .>= sqlInt4 1
  restrict -< n ^. node_typename .== (sqlInt4 $ toDBid NodeDocument)
  returnA -< view node_hyperdata n
selectDocNodes :: HasDBid NodeType => CorpusId -> Cmd err [Node HyperdataDocument]
selectDocNodes cId = runOpaQuery (queryDocNodes cId)
...
...
@@ -208,22 +213,19 @@ selectDocNodes cId = runOpaQuery (queryDocNodes cId)
queryDocNodes :: HasDBid NodeType => CorpusId -> O.Select NodeRead
queryDocNodes cId = proc () -> do
  (n, nn) <- joinInCorpus -< ()
  restrict -< nn ^. nn_node1_id   .== (toNullable $ pgNodeId cId)
  restrict -< nn ^. nn_category   .>= (toNullable $ sqlInt4 1)
  restrict -< n  ^. node_typename .== (sqlInt4 $ toDBid NodeDocument)
  restrict -< matchMaybe nn $ \case
    Nothing  -> toFields True
    Just nn' -> (nn' ^. nn_node1_id .== pgNodeId cId) .&&
                (nn' ^. nn_category) .>= sqlInt4 1
  restrict -< n ^. node_typename .== sqlInt4 (toDBid NodeDocument)
  returnA -< n
joinInCorpus :: O.Select (NodeRead, NodeNodeReadNull)
joinInCorpus = leftJoin queryNodeTable queryNodeNodeTable cond
  where
    cond :: (NodeRead, NodeNodeRead) -> Column SqlBool
    cond (n, nn) = nn ^. nn_node2_id .== (view node_id n)

_joinOn1 :: O.Select (NodeRead, NodeNodeReadNull)
_joinOn1 = leftJoin queryNodeTable queryNodeNodeTable cond
  where
    cond :: (NodeRead, NodeNodeRead) -> Column SqlBool
    cond (n, nn) = nn ^. nn_node1_id .== n ^. node_id

joinInCorpus :: O.Select (NodeRead, MaybeFields NodeNodeRead)
joinInCorpus = proc () -> do
  n <- queryNodeTable -< ()
  nn <- optionalRestrict queryNodeNodeTable -<
          (\nn' -> (nn' ^. nn_node2_id) .== view node_id n)
  returnA -< (n, nn)
------------------------------------------------------------------------
...
...
@@ -233,17 +235,15 @@ selectPublicNodes = runOpaQuery (queryWithType NodeFolderPublic)
queryWithType :: HasDBid NodeType
              => NodeType
              -> O.Select (NodeRead, Column (Nullable SqlInt4))
              -> O.Select (NodeRead, MaybeFields (Column SqlInt4))
queryWithType nt = proc () -> do
  (n, nn) <- node_NodeNode -< ()
  restrict -< n ^. node_typename .== (sqlInt4 $ toDBid nt)
  returnA -< (n, nn ^. nn_node2_id)

node_NodeNode :: O.Select (NodeRead, NodeNodeReadNull)
node_NodeNode = leftJoin queryNodeTable queryNodeNodeTable cond
  where
    cond :: (NodeRead, NodeNodeRead) -> Column SqlBool
    cond (n, nn) = nn ^. nn_node1_id .== n ^. node_id

  (n, nn_node2_id') <- node_NodeNode -< ()
  restrict -< n ^. node_typename .== sqlInt4 (toDBid nt)
  returnA -< (n, nn_node2_id')

node_NodeNode :: O.Select (NodeRead, MaybeFields (Field SqlInt4))
node_NodeNode = proc () -> do
  n <- queryNodeTable -< ()
  nn <- optionalRestrict queryNodeNodeTable -<
          (\nn' -> (nn' ^. nn_node1_id) .== (n ^. node_id))
  returnA -< (n, view nn_node2_id <$> nn)
src/Gargantext/Database/Query/Table/User.hs
...
...
@@ -96,7 +96,7 @@ updateUserDB us = mkCmd $ \c -> runUpdate_ c (updateUserQuery us)
-----------------------------------------------------------------------
toUserWrite :: NewUser HashPassword -> UserWrite
toUserWrite (NewUser u m (Auth.PasswordHash p)) =
toUserWrite (NewUser u m (Auth.PasswordHash p)) =
  UserDB { user_id        = Nothing
         , user_password  = sqlStrictText p
         , user_lastLogin = Nothing
...
...
src/Gargantext/Database/Schema/Context.hs
...
...
@@ -72,68 +72,68 @@ contextTable = Table "contexts" (pContext Context { _context_id = option
queryContextTable :: Query ContextRead
queryContextTable = selectTable contextTable
------------------------------------------------------------------------
type ContextWrite = ContextPoly (Maybe (Column SqlInt4))
                                (Maybe (Column SqlText))
                                (Column SqlInt4)
                                (Column SqlInt4)
                                (Maybe (Column SqlInt4))
                                (Column SqlText)
                                (Maybe (Column SqlTimestamptz))
                                (Column SqlJsonb)

type ContextRead = ContextPoly (Column SqlInt4)
                               (Column SqlText)
                               (Column SqlInt4)
                               (Column SqlInt4)
                               (Column SqlInt4)
                               (Column SqlText)
                               (Column SqlTimestamptz)
                               (Column SqlJsonb)

type ContextReadNull = ContextPoly (Column (Nullable SqlInt4))
                                   (Column (Nullable SqlText))
                                   (Column (Nullable SqlInt4))
                                   (Column (Nullable SqlInt4))
                                   (Column (Nullable SqlInt4))
                                   (Column (Nullable SqlText))
                                   (Column (Nullable SqlTimestamptz))
                                   (Column (Nullable SqlJsonb))

type ContextWrite = ContextPoly (Maybe (Field SqlInt4))
                                (Maybe (Field SqlText))
                                (Field SqlInt4)
                                (Field SqlInt4)
                                (Maybe (Field SqlInt4))
                                (Field SqlText)
                                (Maybe (Field SqlTimestamptz))
                                (Field SqlJsonb)

type ContextRead = ContextPoly (Field SqlInt4)
                               (Field SqlText)
                               (Field SqlInt4)
                               (Field SqlInt4)
                               (Field SqlInt4)
                               (Field SqlText)
                               (Field SqlTimestamptz)
                               (Field SqlJsonb)

type ContextReadNull = ContextPoly (FieldNullable SqlInt4)
                                   (FieldNullable SqlText)
                                   (FieldNullable SqlInt4)
                                   (FieldNullable SqlInt4)
                                   (FieldNullable SqlInt4)
                                   (FieldNullable SqlText)
                                   (FieldNullable SqlTimestamptz)
                                   (FieldNullable SqlJsonb)
------------------------------------------------------------------------
-- | Context(Read|Write)Search is slower than Context(Write|Read) use it
-- for full text search only
type ContextSearchWrite = ContextPolySearch (Maybe (Column SqlInt4))
                                            (Column SqlInt4)
                                            (Column SqlInt4)
                                            (Column (Nullable SqlInt4))
                                            (Column SqlText)
                                            (Maybe (Column SqlTimestamptz))
                                            (Column SqlJsonb)
                                            (Maybe (Column SqlTSVector))
                                            (Maybe (Field SqlInt4))
                                            (Field SqlInt4)
                                            (Field SqlInt4)
                                            (FieldNullable SqlInt4)
                                            (Field SqlText)
                                            (Maybe (Field SqlTimestamptz))
                                            (Field SqlJsonb)
                                            (Maybe (Field SqlTSVector))

type ContextSearchRead = ContextPolySearch (Column SqlInt4)
                                           (Column SqlInt4)
                                           (Column SqlInt4)
                                           (Column (Nullable SqlInt4))
                                           (Column SqlText)
                                           (Column SqlTimestamptz)
                                           (Column SqlJsonb)
                                           (Column SqlTSVector)
                                           (Field SqlInt4)
                                           (Field SqlInt4)
                                           (Field SqlInt4)
                                           (FieldNullable SqlInt4)
                                           (Field SqlText)
                                           (Field SqlTimestamptz)
                                           (Field SqlJsonb)
                                           (Field SqlTSVector)

type ContextSearchReadNull = ContextPolySearch (Column (Nullable SqlInt4))
                                               (Column (Nullable SqlInt4))
                                               (Column (Nullable SqlInt4))
                                               (Column (Nullable SqlInt4))
                                               (Column (Nullable SqlText))
                                               (Column (Nullable SqlTimestamptz))
                                               (Column (Nullable SqlJsonb))
                                               (Column (Nullable SqlTSVector))
                                               (FieldNullable SqlInt4)
                                               (FieldNullable SqlInt4)
                                               (FieldNullable SqlInt4)
                                               (FieldNullable SqlInt4)
                                               (FieldNullable SqlText)
                                               (FieldNullable SqlTimestamptz)
                                               (FieldNullable SqlJsonb)
                                               (FieldNullable SqlTSVector)

data ContextPolySearch id
...
...
src/Gargantext/Database/Schema/ContextNodeNgrams.hs
...
...
@@ -40,28 +40,28 @@ data ContextNodeNgramsPoly c n ngrams_id ngt w dc
  }
  deriving (Show)

type ContextNodeNgramsWrite =
  ContextNodeNgramsPoly (Column SqlInt4) (Column SqlInt4) (Column SqlInt4) (Column SqlInt4) (Column SqlFloat8) (Column SqlInt4)
  ContextNodeNgramsPoly (Field SqlInt4) (Field SqlInt4) (Field SqlInt4) (Field SqlInt4) (Field SqlFloat8) (Field SqlInt4)

type ContextNodeNgramsRead =
  ContextNodeNgramsPoly (Column SqlInt4) (Column SqlInt4) (Column SqlInt4) (Column SqlInt4) (Column SqlFloat8) (Column SqlInt4)
  ContextNodeNgramsPoly (Field SqlInt4) (Field SqlInt4) (Field SqlInt4) (Field SqlInt4) (Field SqlFloat8) (Field SqlInt4)

type ContextNodeNgramsReadNull =
  ContextNodeNgramsPoly (Column (Nullable SqlInt4)) (Column (Nullable SqlInt4)) (Column (Nullable SqlInt4)) (Column (Nullable SqlInt4)) (Column (Nullable SqlFloat8)) (Column (Nullable SqlInt4))
  ContextNodeNgramsPoly (FieldNullable SqlInt4) (FieldNullable SqlInt4) (FieldNullable SqlInt4) (FieldNullable SqlInt4) (FieldNullable SqlFloat8) (FieldNullable SqlInt4)

$(makeAdaptorAndInstance "pContextNodeNgrams" ''ContextNodeNgramsPoly)
makeLenses ''ContextNodeNgramsPoly
...
...
@@ -78,3 +78,6 @@ contextNodeNgramsTable = Table "context_node_ngrams"
                      , _cnng_doc_count  = requiredTableField "doc_count"
                      }
                    )
-- queryContextNodeNgramsTable :: Select ContextNodeNgramsRead
-- queryContextNodeNgramsTable = selectTable contextNodeNgramsTable
src/Gargantext/Database/Schema/Ngrams.hs
...
...
@@ -33,7 +33,7 @@ import Data.Text (Text, splitOn, pack, strip)
import Database.PostgreSQL.Simple.FromField (returnError, ResultError(..))
import Gargantext.Core (HasDBid(..))
import Gargantext.Core.Types (TODO(..), Typed(..))
import Gargantext.Database.Schema.Prelude
import Gargantext.Database.Schema.Prelude hiding (over)
import Gargantext.Database.Types
import Gargantext.Prelude
import Servant (FromHttpApiData(..), Proxy(..), ToHttpApiData(..))
...
...
@@ -52,17 +52,17 @@ data NgramsPoly id terms n = NgramsDB { _ngrams_id :: !id
                                       , _ngrams_n     :: !n
                                       } deriving (Show)

type NgramsWrite = NgramsPoly (Maybe (Column SqlInt4)) (Column SqlText) (Column SqlInt4)
type NgramsWrite = NgramsPoly (Maybe (Field SqlInt4)) (Field SqlText) (Field SqlInt4)

type NgramsRead = NgramsPoly (Column SqlInt4) (Column SqlText) (Column SqlInt4)
type NgramsRead = NgramsPoly (Field SqlInt4) (Field SqlText) (Field SqlInt4)

type NgramsReadNull = NgramsPoly (Column (Nullable SqlInt4)) (Column (Nullable SqlText)) (Column (Nullable SqlInt4))
type NgramsReadNull = NgramsPoly (FieldNullable SqlInt4) (FieldNullable SqlText) (FieldNullable SqlInt4)

type NgramsDB = NgramsPoly Int Text Int
...
...
@@ -155,10 +155,10 @@ instance DefaultFromField (Nullable SqlInt4) NgramsTypeId
  where
    defaultFromField = fromPGSFromField

pgNgramsType :: NgramsType -> Column SqlInt4
pgNgramsType :: NgramsType -> Field SqlInt4
pgNgramsType = pgNgramsTypeId . ngramsTypeId

pgNgramsTypeId :: NgramsTypeId -> Column SqlInt4
pgNgramsTypeId :: NgramsTypeId -> Field SqlInt4
pgNgramsTypeId (NgramsTypeId n) = sqlInt4 n

ngramsTypeId :: NgramsType -> NgramsTypeId
...
...
src/Gargantext/Database/Schema/Node.hs
...
...
@@ -72,68 +72,68 @@ nodeTable = Table "nodes" (pNode Node { _node_id = optionalTableField "i
queryNodeTable :: Query NodeRead
queryNodeTable = selectTable nodeTable
------------------------------------------------------------------------
type NodeWrite = NodePoly (Maybe (Column SqlInt4))
                          (Maybe (Column SqlText))
                          (Column SqlInt4)
                          (Column SqlInt4)
                          (Maybe (Column SqlInt4))
                          (Column SqlText)
                          (Maybe (Column SqlTimestamptz))
                          (Column SqlJsonb)

type NodeRead = NodePoly (Column SqlInt4)
                         (Column SqlText)
                         (Column SqlInt4)
                         (Column SqlInt4)
                         (Column SqlInt4)
                         (Column SqlText)
                         (Column SqlTimestamptz)
                         (Column SqlJsonb)

type NodeReadNull = NodePoly (Column (Nullable SqlInt4))
                             (Column (Nullable SqlText))
                             (Column (Nullable SqlInt4))
                             (Column (Nullable SqlInt4))
                             (Column (Nullable SqlInt4))
                             (Column (Nullable SqlText))
                             (Column (Nullable SqlTimestamptz))
                             (Column (Nullable SqlJsonb))

type NodeWrite = NodePoly (Maybe (Field SqlInt4))
                          (Maybe (Field SqlText))
                          (Field SqlInt4)
                          (Field SqlInt4)
                          (Maybe (Field SqlInt4))
                          (Field SqlText)
                          (Maybe (Field SqlTimestamptz))
                          (Field SqlJsonb)

type NodeRead = NodePoly (Field SqlInt4)
                         (Field SqlText)
                         (Field SqlInt4)
                         (Field SqlInt4)
                         (Field SqlInt4)
                         (Field SqlText)
                         (Field SqlTimestamptz)
                         (Field SqlJsonb)

type NodeReadNull = NodePoly (FieldNullable SqlInt4)
                             (FieldNullable SqlText)
                             (Field SqlInt4)
                             (Field SqlInt4)
                             (FieldNullable SqlInt4)
                             (Field SqlText)
                             (FieldNullable SqlTimestamptz)
                             (Field SqlJsonb)
------------------------------------------------------------------------
-- | Node(Read|Write)Search is slower than Node(Write|Read) use it
-- for full text search only
type NodeSearchWrite = NodePolySearch (Maybe (Column SqlInt4))
                                      (Column SqlInt4)
                                      (Column SqlInt4)
                                      (Column (Nullable SqlInt4))
                                      (Column SqlText)
                                      (Maybe (Column SqlTimestamptz))
                                      (Column SqlJsonb)
                                      (Maybe (Column SqlTSVector))
                                      (Maybe (Field SqlInt4))
                                      (Field SqlInt4)
                                      (Field SqlInt4)
                                      (FieldNullable SqlInt4)
                                      (Field SqlText)
                                      (Maybe (Field SqlTimestamptz))
                                      (Field SqlJsonb)
                                      (Maybe (Field SqlTSVector))

type NodeSearchRead = NodePolySearch (Column SqlInt4)
                                     (Column SqlInt4)
                                     (Column SqlInt4)
                                     (Column (Nullable SqlInt4))
                                     (Column SqlText)
                                     (Column SqlTimestamptz)
                                     (Column SqlJsonb)
                                     (Column SqlTSVector)
                                     (Field SqlInt4)
                                     (Field SqlInt4)
                                     (Field SqlInt4)
                                     (FieldNullable SqlInt4)
                                     (Field SqlText)
                                     (Field SqlTimestamptz)
                                     (Field SqlJsonb)
                                     (Field SqlTSVector)

type NodeSearchReadNull = NodePolySearch (Column (Nullable SqlInt4))
                                         (Column (Nullable SqlInt4))
                                         (Column (Nullable SqlInt4))
                                         (Column (Nullable SqlInt4))
                                         (Column (Nullable SqlText))
                                         (Column (Nullable SqlTimestamptz))
                                         (Column (Nullable SqlJsonb))
                                         (Column (Nullable SqlTSVector))
                                         (FieldNullable SqlInt4)
                                         (FieldNullable SqlInt4)
                                         (FieldNullable SqlInt4)
                                         (FieldNullable SqlInt4)
                                         (FieldNullable SqlText)
                                         (FieldNullable SqlTimestamptz)
                                         (FieldNullable SqlJsonb)
                                         (FieldNullable SqlTSVector)

data NodePolySearch id
...
...
src/Gargantext/Database/Schema/NodeContext.hs
...
...
@@ -34,23 +34,23 @@ data NodeContextPoly id node_id context_id score cat
                                     , _nc_category   :: !cat
                                     } deriving (Show)

type NodeContextWrite = NodeContextPoly (Maybe (Column (SqlInt4)))
                                        (Column (SqlInt4))
                                        (Column (SqlInt4))
                                        (Maybe (Column (SqlFloat8)))
                                        (Maybe (Column (SqlInt4)))
type NodeContextWrite = NodeContextPoly (Maybe (Field SqlInt4))
                                        (Field SqlInt4)
                                        (Field SqlInt4)
                                        (Maybe (Field SqlFloat8))
                                        (Maybe (Field SqlInt4))

type NodeContextRead = NodeContextPoly (Column (SqlInt4))
                                       (Column (SqlInt4))
                                       (Column (SqlInt4))
                                       (Column (SqlFloat8))
                                       (Column (SqlInt4))
type NodeContextRead = NodeContextPoly (Field SqlInt4)
                                       (Field SqlInt4)
                                       (Field SqlInt4)
                                       (Field SqlFloat8)
                                       (Field SqlInt4)

type NodeContextReadNull = NodeContextPoly (Column (Nullable SqlInt4))
                                           (Column (Nullable SqlInt4))
                                           (Column (Nullable SqlInt4))
                                           (Column (Nullable SqlFloat8))
                                           (Column (Nullable SqlInt4))
type NodeContextReadNull = NodeContextPoly (FieldNullable SqlInt4)
                                           (FieldNullable SqlInt4)
                                           (FieldNullable SqlInt4)
                                           (FieldNullable SqlFloat8)
                                           (FieldNullable SqlInt4)

type NodeContext = NodeContextPoly (Maybe Int) NodeId NodeId (Maybe Double) (Maybe Int)
...
...
src/Gargantext/Database/Schema/NodeNode.hs
{-|
Module : Gargantext.Database.Schema.NodeNode
Description :
Description :
Copyright : (c) CNRS, 2017-Present
License : AGPL + CECILL v3
Maintainer : team@gargantext.org
...
...
@@ -30,20 +30,20 @@ data NodeNodePoly node1_id node2_id score cat
                                  , _nn_category :: !cat
                                  } deriving (Show)

type NodeNodeWrite = NodeNodePoly (Column (SqlInt4))
                                  (Column (SqlInt4))
                                  (Maybe (Column (SqlFloat8)))
                                  (Maybe (Column (SqlInt4)))
type NodeNodeWrite = NodeNodePoly (Field SqlInt4)
                                  (Field SqlInt4)
                                  (Maybe (Field SqlFloat8))
                                  (Maybe (Field SqlInt4))

type NodeNodeRead = NodeNodePoly (Column (SqlInt4))
                                 (Column (SqlInt4))
                                 (Column (SqlFloat8))
                                 (Column (SqlInt4))
type NodeNodeRead = NodeNodePoly (Field SqlInt4)
                                 (Field SqlInt4)
                                 (Field SqlFloat8)
                                 (Field SqlInt4)

type NodeNodeReadNull = NodeNodePoly (Column (Nullable SqlInt4))
                                     (Column (Nullable SqlInt4))
                                     (Column (Nullable SqlFloat8))
                                     (Column (Nullable SqlInt4))
type NodeNodeReadNull = NodeNodePoly (Field SqlInt4)
                                     (Field SqlInt4)
                                     (FieldNullable SqlFloat8)
                                     (FieldNullable SqlInt4)

type NodeNode = NodeNodePoly NodeId NodeId (Maybe Double) (Maybe Int)
...
...
@@ -60,4 +60,3 @@ nodeNodeTable =
                 , _nn_category = optionalTableField "category"
                 }
               )
src/Gargantext/Database/Schema/User.hs
...
...
@@ -99,11 +99,11 @@ type UserRead = UserPoly (Column SqlInt4) (Column SqlText)
                         (Column SqlTimestamptz) (Column SqlText)

type UserReadNull = UserPoly (Column (Nullable SqlInt4)) (Column (Nullable SqlText))
                             (Column (Nullable SqlTimestamptz)) (Column (Nullable SqlBool))
                             (Column (Nullable SqlText)) (Column (Nullable SqlText))
                             (Column (Nullable SqlText)) (Column (Nullable SqlText))
                             (Column (Nullable SqlBool)) (Column (Nullable SqlBool))
type UserReadNull = UserPoly (Column (Nullable SqlInt4)) (Column SqlText)
                             (Column (Nullable SqlTimestamptz)) (Column SqlBool)
                             (Column SqlText) (Column SqlText)
                             (Column SqlText) (Column SqlText)
                             (Column SqlBool) (Column SqlBool)
                             (Column (Nullable SqlTimestamptz)) (Column (Nullable SqlText))
...
...
src/Gargantext/Utils/Tuple.hs
0 → 100644
module Gargantext.Utils.Tuple where

import Protolude

uncurryMaybe :: (Maybe a, Maybe b) -> Maybe (a, b)
uncurryMaybe (Nothing, _)     = Nothing
uncurryMaybe (_, Nothing)     = Nothing
uncurryMaybe (Just a, Just b) = Just (a, b)
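-- Usage sketch for the new helper (not part of the commit): 'uncurryMaybe'
-- collapses a pair of optional values, e.g. the two columns projected out of a
-- MaybeFields join once they arrive Haskell-side as separate Maybe values.
-- '_exampleUncurryMaybe' is a hypothetical illustration, not code from the diff.
_exampleUncurryMaybe :: Bool
_exampleUncurryMaybe =
     uncurryMaybe (Just 'a', Just (3 :: Int)) == Just ('a', 3)            -- both present
  && uncurryMaybe (Nothing :: Maybe Char, Just (3 :: Int)) == Nothing     -- any Nothing collapses the pair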
stack.yaml
...
...
@@ -64,8 +64,8 @@ extra-deps:
    commit: fd7e5d7325939103cd87d0dc592faf644160341c
# Databases libs
  - git: https://github.com/delanoe/haskell-opaleye.git
    commit: 756cb90f4ce725463d957bc899d764e0ed73738c
  - git: https://github.com/garganscript/haskell-opaleye.git
    commit: 18c4958e076f5f8f82a4e4a3fc9ec659d2bd8766
  - git: https://github.com/delanoe/hsparql.git
    commit: 2acbbc55ac9bbd4bf1a713c586b8b8e8b82892eb
  - git: https://github.com/robstewart57/rdf4h.git
...
...
@@ -73,8 +73,7 @@ extra-deps:
# External Data API connectors
  - git: https://gitlab.iscpif.fr/gargantext/crawlers/pubmed.git
    commit: 31cb4d28dcb5d17274cede5e67b2a01914379129
    #commit: 364885c891cbadcd4d8a623d2e41394b09f653aa
    commit: 4ade495751eaf31d3ca1ac8b0ae13d3538c6e18c
  - git: https://gitlab.iscpif.fr/gargantext/crawlers/istex.git
    commit: a34bb341236d82cf3d488210bc1d8448a98f5808
  - git: https://gitlab.iscpif.fr/gargantext/crawlers/hal.git
...
...
@@ -177,3 +176,4 @@ ghc-options:
  hmatrix: -O2 -fsimpl-tick-factor=10000 -fdicts-cheap -fdicts-strict -flate-dmd-anal -fno-state-hack
  sparse-linear: -O2 -fsimpl-tick-factor=10000 -fdicts-cheap -fdicts-strict -flate-dmd-anal -fno-state-hack
  gargantext-graph: -O2 -fsimpl-tick-factor=10000 -fdicts-cheap -fdicts-strict -flate-dmd-anal -fno-state-hack
  "$locals": -fwrite-ide-info -hiedir=".stack-work/hiedb"
weeder.dhall
0 → 100644
{ roots = [ "^Main\\.main\$"
, "^Paths_.*"
], type-class-roots = True }