haskell-gargantext
Commit cbe804f5 authored Oct 09, 2018 by Alexandre Delanoë

[Cosmetics]

parent 91a59bee
Showing 1 changed file with 8 additions and 5 deletions

src/Gargantext/Text/Terms/Stop.hs  +8 -5
@@ -60,12 +60,17 @@ blanks xs = [' '] <> xs <> [' ']
 -- | Blocks increase the size of the word to ease computations
 -- some border and unexepected effects can happen, need to be tested
 blockOf :: Int -> String -> String
-blockOf n st = DL.concat $ DL.take n $ DL.repeat st
+blockOf n = DL.concat . DL.take n . DL.repeat

 -- | Chunks is the same function as splitBy in Context but for Strings,
 -- not Text (without pack and unpack operations that are not needed).
 chunks :: Int -> Int -> String -> [String]
 chunks n m = DL.take m . filter (not . all (== ' ')) . chunkAlong (n+1) 1 . DL.concat . DL.take 1000 . DL.repeat . blanks

 allChunks :: [Int] -> Int -> String -> [String]
 allChunks ns m st = DL.concat $ map (\n -> chunks n m st) ns
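Note on the blockOf change above: the removed line builds the repeated word with explicit arguments and ($), while the added line expresses the same pipeline point-free. A minimal sketch of the equivalence, assuming DL is the module's qualified import of Data.List (the import is not shown in this hunk):

import qualified Data.List as DL

-- old, pointful form (removed line)
blockOfOld :: Int -> String -> String
blockOfOld n st = DL.concat $ DL.take n $ DL.repeat st

-- new, point-free form (added line); same behaviour by eta reduction
blockOfNew :: Int -> String -> String
blockOfNew n = DL.concat . DL.take n . DL.repeat

main :: IO ()
main = print (blockOfNew 3 "ab " == blockOfOld 3 "ab ")  -- True, both give "ab ab ab "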
@@ -159,7 +164,6 @@ toPrior s el = prior $ pebLang s el
 pebLang :: String -> EventLang -> [(Lang, (Freq, TotalFreq))]
 pebLang st = map (\(l,eb) -> (l, peb' st eb)) . DM.toList

 ------------------------------------------------------------------------
 prior :: [(Lang, (Freq, TotalFreq))] -> [(Lang, Double)]
 prior ps = zip ls $ zipWith (\x y -> x ^ (99::Int) * y) (map (\(a,_) -> part a (sum $ map fst ps')) ps')
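Note on the pebLang context above: it turns the per-language event books into an association list with DM.toList and scores each one against the input string. A rough, self-contained sketch of that "map over DM.toList" shape, with a hypothetical length-based scorer standing in for peb' and assuming DM is the qualified import of Data.Map:

import qualified Data.Map as DM

-- hypothetical scorer: count how often the probe string occurs in each bucket
scorePerKey :: String -> DM.Map String [String] -> [(String, Int)]
scorePerKey st = map (\(k, ws) -> (k, length (filter (== st) ws))) . DM.toList

main :: IO ()
main = print (scorePerKey "de" (DM.fromList [ ("fr", ["de", "la", "de"])
                                            , ("en", ["the", "of"]) ]))
-- [("en",0),("fr",2)]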
@@ -195,7 +199,6 @@ op f (EventBook ef1 en1)
      (EventBook ef2 en2) = EventBook (DM.unionWith f ef1 ef2)
                                      (DM.unionWith f en1 en2)
 ------------------------------------------------------------------------
 ------------------------------------------------------------------------
 -- * Make the distributions
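Note on the op hunk above: both fields of the resulting EventBook are built with DM.unionWith, which merges two maps key-wise and combines the values of shared keys with the supplied operator. A small illustration, assuming DM is the qualified import of Data.Map:

import qualified Data.Map as DM

main :: IO ()
main = print (DM.unionWith (+) (DM.fromList [("ab", 2), ("ba", 1)])
                               (DM.fromList [("ab", 3), ("ca", 4)]))
-- fromList [("ab",5),("ba",1),("ca",4)]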