gargantext / crawlers / searx · Commits

Commit a03304f5, authored Nov 12, 2019 by Mudada
Parent: 8e257d7c

    made searx request recursive

Showing 1 changed file with 64 additions and 10 deletions.

app/Main.hs (+64 −10):
@@ -5,16 +5,70 @@ module Main where
 import SEARX
 import SEARX.Client
 import System.Directory
+import Text.HTML.TagSoup
+import Data.Maybe
+import qualified Data.Text as T
 
-main :: IO ()
-main = do
-  res <- getMetadataWith "abeille" 1
-  case res of
-    (Left err) -> print err
-    (Right r) -> do
-      let urls = (_document_id) <$> (_documents_hits r)
-      fp <- setUpDirectory
-      print urls
-      articles <- sequence $ parseWebsite fp <$> urls
-      removeDirectoryRecursive $ fp <> "/readability"
-      print articles
+emptyArticle :: Article
+emptyArticle = Article
+  { title = Nothing
+  , byline = Nothing
+  , dir = Nothing
+  , content = Nothing
+  , textContent = Nothing
+  , SEARX.length = Nothing
+  , excerpt = Nothing
+  , siteName = Nothing
+  }
+
+type Depth = Int
+type Limit = Int
+type Query = T.Text
+
+searxSearch :: Query -> Limit -> Depth -> IO [Maybe Article]
+searxSearch q l d = do
+  res <- getMetadataWith q l
+  case res of
+    (Left err) -> return []
+    (Right r) -> do
+      let urls = take l $ _document_id <$> (_documents_hits r)
+      parseWebsiteReq d urls
+
+parseWebsiteReqWithFp :: FilePath -> Depth -> [T.Text] -> IO [Maybe Article]
+parseWebsiteReqWithFp fp d urls
+  | d <= 0 = do
+      art <- parseWebsite'
+      return art
+  | otherwise = do
+      articles <- parseWebsite'
+      d <- parseWebsiteReqWithFp fp (d - 1) (getUrlsFromWebsite articles)
+      return $ d <> articles
+  where
+    parseWebsite' = sequence $ parseWebsite fp <$> urls
+
+parseWebsiteReq :: Depth -> [T.Text] -> IO [Maybe Article]
+parseWebsiteReq d urls = do
+  fp <- setUpDirectory
+  articles <- parseWebsiteReqWithFp fp d urls
+  removeDirectoryRecursive $ fp <> "/readability"
+  return articles
+
+getUrlsFromWebsite :: [Maybe Article] -> [T.Text]
+getUrlsFromWebsite articles =
+  filter (/= "") $ fromAttrib "href" <$>
+    (filter isTagOpen $ concat $ parseTags <$>
+      ((fromMaybe "" . content . fromMaybe emptyArticle) <$> articles))
+
+-- searxSearch :: Query -> Limit -> Depth -> IO [Maybe Article]
+main :: IO ()
+main = do
+  articles <- searxSearch "abeille" 10 1
+  print $ articles
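
The heart of the change is parseWebsiteReqWithFp, which recurses on the Depth argument: at depth 0 it only fetches the URLs it was given; otherwise it fetches them, pulls outgoing links out of the results with getUrlsFromWebsite, and descends with d - 1, prepending the deeper results. (The committed code rebinds d to the recursive result list in the otherwise branch; that compiles, since lists form a Semigroup, but it shadows the depth.) Below is a minimal, self-contained sketch of the same depth-limited pattern, with stub fetch and link-extraction functions standing in for parseWebsite and getUrlsFromWebsite; every name in it is hypothetical, not from the commit.

module CrawlSketch where

-- Depth-limited recursive crawl: the same shape as parseWebsiteReqWithFp,
-- with IO stubs in place of the real fetcher and link extractor.
crawlToDepth :: Int -> [String] -> IO [String]
crawlToDepth d urls
  | d <= 0    = fetchAll urls
  | otherwise = do
      pages  <- fetchAll urls
      deeper <- crawlToDepth (d - 1) (concatMap linksOf pages)
      return (deeper <> pages)
  where
    fetchAll = mapM fetchOne
    -- stub fetch: pretend every URL yields one page of text
    fetchOne u = return ("<page for " <> u <> ">")
    -- stub link extraction: a real crawler would parse hrefs here
    linksOf _ = []

main :: IO ()
main = crawlToDepth 1 ["https://example.org"] >>= mapM_ putStrLn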
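
getUrlsFromWebsite leans on tagsoup for the link extraction: it parses each article's HTML content, keeps only the opening tags, reads their href attributes, and drops the empty ones. A standalone demonstration of that step, assuming only the tagsoup package (the HTML literal is made up for illustration):

import Text.HTML.TagSoup

-- Collect non-empty href attributes from every opening tag: the same
-- filter / fromAttrib / isTagOpen pipeline used by getUrlsFromWebsite.
hrefsIn :: String -> [String]
hrefsIn html =
  filter (/= "") $ fromAttrib "href" <$> filter isTagOpen (parseTags html)

main :: IO ()
main = print $ hrefsIn "<a href=\"https://example.org\">link</a><p>plain</p>"
-- prints ["https://example.org"]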