Commit Graph

63 Commits

Author SHA1 Message Date
Émilien (perso) b6e3cf6ced wikipedia wikidata infobox + disable wikisource (#2806)
Co-authored-by: Markus Heiser <markus.heiser@darmarit.de>
2023-10-19 18:03:21 +02:00
jazzzooo b6316020f7 [fix] spelling 2023-10-19 18:02:05 +02:00
Markus Heiser 9b575a997b [fix] doc of locales.get_engine_locale() / zh-classical is misleading
Wikipedia's zh-classical is not zh_Hant (see doc-string of engines.wikipedia).
Fixed the example in the doc-string of locales.get_engine_locale() to 'zh_TW'.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2023-04-17 08:48:57 +02:00
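
For context, a hedged sketch of how locales.get_engine_locale() maps a SearXNG
locale to an engine's own tag; the signature and the shape of the mapping shown
here are assumptions based on the doc-string, not verified against the source::

    from searx.locales import get_engine_locale

    # engine_locales: SearXNG locale tag -> Wikipedia's own tag (illustrative)
    engine_locales = {'zh-TW': 'zh_TW', 'zh-CN': 'zh'}
    eng_tag = get_engine_locale('zh-TW', engine_locales, default='en')
    print(eng_tag)   # expected: 'zh_TW'
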
Markus Heiser 27369ebec2 [fix] searxng_extra/update/update_engine_descriptions.py (part 1)
Follow up of #2269

The script that updates the engine descriptions no longer works since
PR #2269 was merged.

searx/engines/wikipedia.py
==========================

1. There was a misuse of zh-classical.wikipedia.org:

   - `zh-classical` is dedicated to classical Chinese [1], which is not
     traditional Chinese [2].

   - zh.wikipedia.org has LanguageConverter enabled [3] and is going to
     dynamically show simplified or traditional Chinese according to the
     HTTP Accept-Language header.

2. The update_engine_descriptions.py needs a list of all Wikipedias.  The
   implementation from #2269 included only a reduced list:

   - https://meta.wikimedia.org/wiki/Wikipedia_article_depth
   - https://meta.wikimedia.org/wiki/List_of_Wikipedias

searxng_extra/update/update_engine_descriptions.py
==================================================

Before PR #2269 there was a match_language() function that approximated locales
using various methods.  Since PR #2269 the data model only contains the
language types that babel can recognize.  The approximation methods, which are
needed (only here) to determine the descriptions, must be replaced by other
methods; a babel-based sketch follows this entry.

[1] https://en.wikipedia.org/wiki/Classical_Chinese
[2] https://en.wikipedia.org/wiki/Traditional_Chinese_characters
[3] https://www.mediawiki.org/wiki/Writing_systems#LanguageConverter

Closes: https://github.com/searxng/searxng/issues/2330
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2023-04-15 16:03:59 +02:00
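
A minimal sketch of the kind of babel-based approximation the update script
needs once match_language() is gone; names are illustrative, not taken from the
actual implementation::

    from babel import Locale, UnknownLocaleError

    def approximate_locale(wiki_tag):
        """Best-effort parse of a Wikipedia language tag into a babel Locale."""
        try:
            return Locale.parse(wiki_tag, sep='-')
        except (UnknownLocaleError, ValueError):
            return None   # tag unknown to babel, needs another method

    print(approximate_locale('zh-TW'))   # -> zh_Hant_TW (likely subtags resolved)
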
Markus Heiser 858aa3e604 [mod] wikipedia & wikidata: upgrade to data_type: traits_v1
BTW this fixes an issue in wikipedia: SearXNG's locales zh-TW and zh-HK now
use language `zh-classical` from Wikipedia (and not `zh`).

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2023-03-24 10:37:42 +01:00
Markus Heiser 7daf4f95ef [mod] Wikipedia: fetch engine traits (data_type: supported_languages)
Implements a fetch_traits function for the Wikipedia engines.

.. note::

   Does not include migration of the request method from 'supported_languages'
   to the 'traits' (EngineTraits) object!

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2023-03-24 10:37:42 +01:00
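
An illustrative sketch of what such a fetch_traits function does, namely fill
a traits object with the engine's own language tags; the real EngineTraits API
may differ, the stand-in class here is an assumption::

    from dataclasses import dataclass, field

    @dataclass
    class Traits:                      # stand-in for SearXNG's EngineTraits
        languages: dict = field(default_factory=dict)

    def fetch_traits(traits):
        # in the real engine this list is fetched over HTTP
        for eng_tag in ('en', 'de', 'zh', 'zh-classical'):
            traits.languages[eng_tag.replace('-', '_')] = eng_tag

    t = Traits()
    fetch_traits(t)
    print(t.languages)
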
Solirs ac169a0f75 Pass black formatting test 2023-03-21 00:41:36 +01:00
Solirs e26bce33d4 WIKIDATA: Add description for results 2023-03-21 00:14:54 +01:00
Markus Heiser ba8959ad7c [fix] typos / reported by @kianmeng in searx PR-3366
[PR-3366] https://github.com/searx/searx/pull/3366

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2022-09-27 18:32:14 +02:00
Alexandre Flament cd2dd5dd55 Wikidata engine: ignore dummy entities
Close #641
2022-06-11 11:09:21 +02:00
Alexandre Flament d068b67a71 Wikidata engine: minor change of the SPARQL request
The engine can be slow, especially when the query returns no results.
See https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual/MWAPI#Find_articles_in_Wikipedia_speaking_about_cheese_and_see_which_Wikibase_items_they_correspond_to

Related to #1290
2022-06-11 10:50:11 +02:00
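
Not the engine's actual query, just a minimal sketch of sending a SPARQL
request to the Wikidata Query Service, the kind of request this commit tunes::

    import requests

    QUERY = """
    SELECT ?item ?itemLabel WHERE {
      ?item rdfs:label "cheese"@en .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 5
    """
    resp = requests.get('https://query.wikidata.org/sparql',
                        params={'query': QUERY, 'format': 'json'},
                        timeout=30)
    for row in resp.json()['results']['bindings']:
        print(row['item']['value'], row['itemLabel']['value'])
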
Markus Heiser 2de007138c [fix] prepare for pylint 2.14.0
Remove issue reported by Pylint 2.14.0:

- no-self-use: has been moved to an optional extension [1]
- The refactoring checker now also raises 'consider-using-generator' messages
  for max(), min() and sum(). [2]

.pylintrc:
  - the `<option name>-hint` options were removed long ago; Pylint 2.14.0
    raises an error on invalid options
  - bad-continuation and bad-whitespace have been removed [3]

[1] https://pylint.pycqa.org/en/latest/whatsnew/2/2.14/summary.html#removed-checkers
[2] https://pylint.pycqa.org/en/latest/whatsnew/2/2.14/full.html#what-s-new-in-pylint-2-14-0
[3] https://pylint.pycqa.org/en/latest/whatsnew/2/2.6/summary.html#summary-release-highlights

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2022-06-03 15:41:52 +02:00
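
For reference, the pattern the new consider-using-generator check flags::

    numbers = [3, 1, 4, 1, 5]

    total = sum([n * n for n in numbers])  # flagged: builds a throwaway list
    total = sum(n * n for n in numbers)    # preferred: generator expression
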
Markus Heiser a967e59590 [pylint] searx/engines/wikidata.py (no functional change)
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2022-02-07 10:15:32 +01:00
Léon Tiekötter 1c151ae92b [fix] wikidata: URL decoding and file extension handling
Add '.png' to the second img_src_name if it has the extension '.svg'.
Use urllib.parse.unquote for URL decoding.
2022-02-07 00:21:02 +01:00
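
A sketch of the two fixes combined, with the surrounding engine code omitted;
the '.png' suffix assumes Wikimedia's convention of rasterizing SVG thumbnails::

    from urllib.parse import unquote

    img_src = 'https://upload.wikimedia.org/wikipedia/commons/a/ab/Some%20File.svg'
    img_src = unquote(img_src)      # decode %20 and friends
    if img_src.endswith('.svg'):
        img_src += '.png'           # request the rasterized thumbnail
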
Markus Heiser a13c5d70c7 [fix] wikidata engine: select image with higher (not lower) priority
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2022-02-06 23:35:55 +01:00
Léon Tiekötter a50f32bcfc wikidata: load thumbnail instead of full image 2022-02-06 23:25:50 +01:00
Léon Tiekötter 560a14e77b [fix] wikidata infobox images
Wikidata infobox images are now loaded from upload.wikimedia.org instead of commons.wikimedia.org to prevent redirects.

Co-authored-by: Markus Heiser <markus.heiser@darmarit.de>
2022-02-06 22:16:06 +01:00
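
A sketch of building a direct upload.wikimedia.org thumbnail URL from a
Commons file name, which avoids the redirect that a
commons.wikimedia.org/wiki/Special:FilePath URL answers with; it relies on
Commons' documented md5-based path scheme::

    from hashlib import md5

    def commons_thumb_url(filename, width=300):
        name = filename.replace(' ', '_')
        h = md5(name.encode('utf-8')).hexdigest()
        return ('https://upload.wikimedia.org/wikipedia/commons/thumb/'
                f'{h[0]}/{h[:2]}/{name}/{width}px-{name}')

    print(commons_thumb_url('Example.jpg'))
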
Markus Heiser d84226bf63 [fix] issues reported by pylint
Fix pylint issues from commit (3d96a983)

    [format.python] initial formatting of the python code

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2021-12-27 10:16:20 +01:00
Markus Heiser 3d96a9839a [format.python] initial formatting of the python code
This patch was generated by black [1]::

    make format.python

[1] https://github.com/psf/black

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2021-12-27 09:26:22 +01:00
Markus Heiser fcdc2c2cd2 [format.python] disable py code formatting for some hunks of code
Disable python-black's formatting for those hunks of code where formatting
makes the code less readable.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2021-12-27 09:16:03 +01:00
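
Black honors marker comments for exactly this purpose, presumably how the
hunks were excluded::

    # fmt: off
    LANG_MAP = {
        'zh-TW' : 'zh_TW',
        'zh'    : 'zh',
    }
    # fmt: on
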
Markus Heiser aecfb2300d [mod] one logger per engine - drop obsolete logger.getChild
Remove the no longer needed `logger = logger.getChild(...)` from engines.

Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
2021-09-06 18:05:46 +02:00
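
In stdlib terms the change amounts to this: instead of each engine module
deriving a child logger itself, the loader now assigns one per engine (the
module name below is illustrative)::

    import logging

    # before, inside the engine module:
    #     logger = logger.getChild('wikidata')
    # after, the loader does the equivalent of:
    logger = logging.getLogger('searx.engines.wikidata')
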
Alexandre Flament d14994dc73 [httpx] replace searx.poolrequests by searx.network
settings.yml:

* outgoing.networks:
   * can contain network definitions
   * properties: enable_http, verify, http2, max_connections, max_keepalive_connections,
     keepalive_expiry, local_addresses, support_ipv4, support_ipv6, proxies, max_redirects, retries
   * retries: 0 by default, the number of times searx retries sending the HTTP request (using a different IP & proxy each time)
   * local_addresses can be "192.168.0.1/24" (IPv6 is supported too)
   * support_ipv4 & support_ipv6: both True by default
     see https://github.com/searx/searx/pull/1034
* each engine can define a "network" section:
   * either a full network definition
   * or a reference to an existing network

* all HTTP requests of an engine use the same HTTP configuration (this was not the case before; see the proxy configuration in master)
2021-04-12 17:25:56 +02:00
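
The parsed shape of such a settings.yml section, shown as a Python structure
with illustrative values rather than a verbatim config::

    outgoing = {
        'networks': {
            'my_network': {              # a named network definition
                'enable_http': False,
                'verify': True,
                'http2': True,
                'retries': 0,            # default: no retry
                'support_ipv4': True,
                'support_ipv6': True,
            },
        },
    }
    engine = {'name': 'wikidata', 'network': 'my_network'}  # by reference
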
Alexandre Flament a4dcfa025c [enh] engines: add about variable
move meta information from comments to the about variable
so the preferences and the documentation can show this information
2021-01-14 20:57:17 +01:00
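
The kind of metadata block this introduces at the top of each engine module;
the keys follow SearXNG's engines and the values sketch wikidata.py from
memory, so they may not be verbatim::

    about = {
        "website": 'https://wikidata.org/',
        "wikidata_id": 'Q2013',
        "official_api_documentation": 'https://query.wikidata.org/',
        "use_official_api": True,
        "require_api_key": False,
        "results": 'JSON',
    }
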
Alexandre Flament d703119d3a [enh] add raise_for_httperror
check HTTP response:
* detect some CAPTCHA challenges (without solving them); in this case the engine is suspended for a long time.
* otherwise raise HTTPError as before

the check is done in poolrequests.py (was before in search.py).

update qwant, wikipedia, wikidata to use raise_for_httperror instead of raise_for_status
2020-12-11 14:37:08 +01:00
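
A minimal sketch of the idea; the real check lives in poolrequests.py and
detects concrete CAPTCHA pages, so the exception name and heuristic here are
placeholders::

    import requests

    class CaptchaError(Exception):
        """Signals the engine should be suspended for a long time."""

    def raise_for_httperror(resp):
        if 'captcha' in resp.text.lower():   # placeholder detection
            raise CaptchaError(resp.url)
        resp.raise_for_status()              # plain HTTPError as before
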
Alexandre Flament bef2f2efa8 [fix] wikidata: fix crash when the item has no description at all and at least one URL. 2020-12-04 17:17:20 +01:00
Alexandre Flament 9ed3ee2beb [mod] wikidata: WDGeoAttribute class: doesn't change the method signature of get_str 2020-12-01 15:21:17 +01:00
Alexandre Flament 3038052c79 [mod] remove unused import
use
from searx.engines.duckduckgo import _fetch_supported_languages, supported_languages_url  # NOQA
so it is possible to easily remove all unused imports using autoflake:
autoflake --in-place --recursive --remove-all-unused-imports searx tests
2020-11-14 14:11:02 +01:00
Alexandre Flament 95bd6033fa [mod] wikidata engine: use one SPARQL request instead of 2 HTTP requests. 2020-10-28 08:09:25 +01:00
Alexandre Flament 2006eb4680 [mod] move extract_text, extract_url to searx.utils 2020-10-02 18:13:56 +02:00
Dalf 1022228d95 Drop Python 2 (1/n): remove unicode string and url_utils 2020-09-10 10:39:04 +02:00
Marc Abonce Seguin 0d8970c8f2
only return one url per "type" in Wikidata (#2151)
i.e. only one official website, one Twitter, etc.
2020-08-27 21:44:48 +02:00
Adam Tauber 29960aa1d9 [enh] add official site link to the top of the infobox - closes #1644 2020-06-09 23:49:13 +02:00
Dalf 85b3723345 [mod] speed optimization
* compile XPath only once
* avoid redundant calls to urlparse
* get_locale(webapp.py): avoid a useless call to request.accept_languages.best_match
2019-11-15 09:33:15 +01:00
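
The XPath part of the optimization, sketched with lxml: compile the expression
once at import time and reuse the compiled object for every result::

    from lxml import etree, html

    TITLE_XPATH = etree.XPath('//div[@class="title"]/text()')  # compiled once

    doc = html.fromstring('<div class="title">hello</div>')
    print(TITLE_XPATH(doc))   # ['hello'], reused for every page
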
Dalf 6e0285b2db [fix] wikidata engine: faster processing, remove one HTTP redirection.
* Search URL is https://www.wikidata.org/w/index.php?{query}&ns0=1 (with ns0=1 at the end to avoid an HTTP redirection)
* url_detail: remove the deprecated disabletidy=1 parameter
* Add eval_xpath function: compile each XPath only once.
* Add get_id_cache: retrieve all HTML elements with an id in one pass, avoiding the slow-to-process dynamic XPath '//div[@id="{propertyid}"]'.replace('{propertyid}')
* Create an etree.HTMLParser() instead of using the global one (see #1575)
2019-07-29 07:39:39 +02:00
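
A sketch of the get_id_cache idea: collect every element carrying an id in a
single pass, then replace the dynamic per-property XPath with O(1) dict
lookups::

    from lxml import html

    def get_id_cache(dom):
        return {e.get('id'): e for e in dom.xpath('//*[@id]')}

    dom = html.fromstring('<div id="P18">image</div><div id="P856">site</div>')
    print(get_id_cache(dom)['P18'].text)   # 'image'
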
Noémi Ványi b63d645a52 Revert "remove 'all' option from search languages"
This reverts commit 4d1770398a.
2019-01-07 21:19:00 +01:00
Marc Abonce Seguin 5568f24d6c [fix] check language aliases when setting search language 2019-01-06 20:31:57 -06:00
Léo Bourrel 7a474db61b Fix formatting 2018-07-06 10:31:01 +02:00
Léo Bourrel acaef6600e Update path to wikidata image 2018-07-05 10:11:45 +02:00
Marc Abonce Seguin b12857a70d [fix] make search requests on wikidata more accurate 2018-04-08 21:17:00 -05:00
Marc Abonce Seguin 772c048d01 refactor engine's search language handling
Add match_language function in utils to match any user given
language code with a list of engine's supported languages.

Also add language_aliases dict on each engine to translate
standard language codes into the custom codes used by the engine.
2018-03-27 00:08:03 -06:00
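
An illustrative reduction of what utils.match_language does; the real function
also weighs country variants, so this is not the shipped code::

    def match_language(code, supported, aliases=None):
        code = (aliases or {}).get(code, code)
        if code in supported:
            return code
        base = code.split('-')[0]
        return next((s for s in supported if s.split('-')[0] == base), None)

    print(match_language('en-GB', ['en-US', 'de']))   # 'en-US'
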
marc 4d1770398a remove 'all' option from search languages 2017-12-06 01:20:15 -06:00
Adam Tauber 52e615dede [enh] py3 compatibility 2017-05-15 12:02:30 +02:00
marc af35eee10b tests for _fetch_supported_languages in engines
and refactor the method to make it testable without making requests
2016-12-15 00:40:21 -06:00
marc f62ce21f50 [mod] fetch supported languages for several engines
utils/fetch_languages.py gets languages supported by each engine and
generates engines_languages.json with each engine's supported languages.
2016-12-13 19:58:10 -06:00
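
The output is roughly one JSON object keyed by engine name; the structure here
is simplified, not the file's verbatim content::

    import json

    engines_languages = {
        'wikipedia': ['en', 'de', 'zh'],
        'wikidata': ['en', 'de', 'zh'],
    }
    with open('engines_languages.json', 'w') as f:
        json.dump(engines_languages, f, indent=2)
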
marc 149802c569 [enh] add supported_languages on engines and auto-generate languages.py 2016-12-13 19:32:00 -06:00
marc ad58b14be7 [fix] merge infoboxes based on weight
also minor changes in attributes and images from wikidata
2016-08-05 23:51:04 -05:00
marc a0a1284998 wikidata refactor and more attributes (see issue #560) 2016-08-05 23:51:04 -05:00
a01200356 93ef11adc0 [enh] multilingual wikidata
disambiguation and tags are in the local language

TOFIX:
    needs to query the API every time to know each label's name
2016-08-05 23:51:04 -05:00
a01200356 8d335dbdae [enh] wikipedia infobox
creates a simple multilingual infobox using Wikipedia's API
2016-04-17 16:22:19 -05:00
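
For illustration, fetching infobox-style data from Wikipedia's API; this uses
today's REST summary endpoint, not necessarily the action-API call the 2016
engine used::

    import requests

    def wikipedia_summary(title, lang='en'):
        url = f'https://{lang}.wikipedia.org/api/rest_v1/page/summary/{title}'
        return requests.get(url, timeout=10).json().get('extract', '')

    print(wikipedia_summary('Cheese'))
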
Adam Tauber bd22e9a336 [fix] pep8 compatibility 2016-01-18 12:47:31 +01:00