- fix issue reported in #1809
- filter out `None` values from the issn and isbn lists (see the sketch below)
- add comments (from publicationName)
- add publisher
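A minimal sketch of the data handling from the list above; the sample record is
illustrative, only the field names (issn, isbn, publicationName) follow the
bullet points:
```python
# illustrative record; only the field names follow the bullets above
source = {'issn': ['1234-5678', None], 'isbn': [None], 'publicationName': 'Some Journal'}

issn = [x for x in source.get('issn', []) if x is not None]   # -> ['1234-5678']
isbn = [x for x in source.get('isbn', []) if x is not None]   # -> []
comments = source.get('publicationName')                      # -> 'Some Journal'
```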
Closes: https://github.com/searxng/searxng/issues/1809
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Some result items from core.ac.uk do not have a URL::
Traceback (most recent call last):
File "searx/search/processors/online.py", line 154, in search
search_results = self._search_basic(query, params)
File "searx/search/processors/online.py", line 142, in _search_basic
return self.engine.response(response)
File "SearXNG/searx/engines/core.py", line 73, in response
'url': source['urls'][0].replace('http://', 'https://', 1),
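A hedged sketch of a guard that avoids the traceback above; the field access
follows the traceback, the surrounding loop and input are illustrative:
```python
# illustrative input: one item with a URL, one without
api_results = [{'urls': ['http://core.ac.uk/x']}, {'urls': []}]

for source in api_results:
    urls = source.get('urls') or []
    if not urls:
        continue                                        # skip items that have no URL
    url = urls[0].replace('http://', 'https://', 1)
    print(url)                                          # -> https://core.ac.uk/x
```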
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Currently, when oa_doi_rewrite finds a DOI in the result URL, it replaces the URL.
In this commit, the plugin adds the key "doi" to the result,
so the paper.html can show it.
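A minimal sketch of the change, assuming the standard plugin on_result hook; the
DOI extractor and the resolver URL are illustrative, not the plugin's real code:
```python
import re

def extract_doi(url: str):
    # hypothetical extractor for this sketch; the real plugin has its own helper
    m = re.search(r'(10\.\d{4,9}/[^\s?#]+)', url)
    return m.group(1) if m else None

def on_result(request, search, result):
    doi = extract_doi(result['url'])
    if doi:
        result['doi'] = doi                         # paper.html can now show the DOI
        result['url'] = 'https://doi.org/' + doi    # rewrite as before (resolver illustrative)
    return True
```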
* Move the datetime-to-str code from searx.webapp.search to searx.webutils.searxng_format_date
* When the month, day, hour, minute and second are zero, the function returns only the year.
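A hedged sketch of the described behaviour, assuming "zero" time fields means
midnight on January 1st (the real implementation lives in searx.webutils):
```python
from datetime import datetime

def searxng_format_date(dt: datetime) -> str:
    # assumption: "month, day, hour, minute and second are zero" is read as
    # a bare year, i.e. January 1st at midnight
    if (dt.month, dt.day, dt.hour, dt.minute, dt.second) == (1, 1, 0, 0, 0):
        return str(dt.year)
    return dt.isoformat(sep=' ')

print(searxng_format_date(datetime(2022, 1, 1)))         # -> '2022'
print(searxng_format_date(datetime(2022, 9, 5, 12, 30))) # -> '2022-09-05 12:30:00'
```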
The word "hackable" may arouse programmers' interest in participating in the
development, but it scares the ordinary user.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Google News is being reworked; the content area of a news item has been
removed.
Closes: https://github.com/searxng/searxng/issues/1790
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Fix::
searx/locales.py:docstring of searx.locales.get_engine_locale:17: \
WARNING: Definition list ends without a blank line; unexpected unindent.
Improvement: don't show default values in the generated documentation when they
are more of a mess than useful information (`:meta hide-value:`).
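A minimal sketch of the `:meta hide-value:` usage mentioned above; the module
variable and its value are illustrative:
```python
# an illustrative module-level mapping whose (large) default value should not
# clutter the generated documentation
LOCALE_FALLBACKS = {'de-TW': 'de', 'xy-XY': 'en'}
"""Maps unsupported locale terms to a sensible fallback.

:meta hide-value:
"""
```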
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
no_result_for_http_status contains a list of HTTP status codes.
These HTTP status codes are treated as an empty result list.
In all other cases an exception is thrown as usual.
Previously, raise_for_httperror ignored all HTTP errors,
which made defective engines invisible in the stats.
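A hedged sketch of the handling described above (simplified; the real logic
lives in the engine processor / network code):
```python
# the engine declares which HTTP status codes simply mean "no results"
no_result_for_http_status = [404]

def raise_for_httperror(resp):
    if resp.status_code in no_result_for_http_status:
        return                   # treated as an empty result list, no exception
    resp.raise_for_status()      # any other HTTP error is raised as usual,
                                 # so defective engines show up in the stats
```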
With the POST method, autocomplete.js does not URL encode the values.
For example "1+1" is sent as "1+1", which is read as "1 1" since a space is URL encoded with a plus.
There is no clean way to fix the bug since autocomplete.js seems abandoned.
The commit monkey patches the ajax function of autocomplete.js.
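To illustrate the encoding problem independently of autocomplete.js, here is
what happens to a literal plus sign in form-encoded data:
```python
from urllib.parse import parse_qs, quote_plus

# an unencoded "+" in the request body decodes to a space -- the bug described above
print(parse_qs('q=1+1'))                   # -> {'q': ['1 1']}

# with proper URL encoding the literal "+" survives the round trip
print(parse_qs('q=' + quote_plus('1+1')))  # -> {'q': ['1+1']}
```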
Related to #1695
Only raise "suspicious Accept-Encoding" when both "gzip" and "deflate" are missing from Accept-Encoding.
This prevents browsers which only implement one compression method from being blocked by the limiter plugin.
Example of a browser which is currently blocked: Lynx (https://lynx.invisible-island.net)
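A hedged sketch of the relaxed check (simplified; the real check lives in the
limiter plugin):
```python
def suspicious_accept_encoding(accept_encoding: str) -> bool:
    encodings = {enc.strip().split(';')[0] for enc in accept_encoding.split(',')}
    # only suspicious when BOTH gzip and deflate are missing
    return 'gzip' not in encodings and 'deflate' not in encodings

print(suspicious_accept_encoding('gzip'))               # False -> e.g. Lynx is not blocked
print(suspicious_accept_encoding('br'))                 # True
print(suspicious_accept_encoding('gzip, deflate, br'))  # False
```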
When a user selects an unknown or invalid locale by using the search syntax:
!qw siemens :de-TW
Before this patch a UnknownLocaleError exception was raised:
```
Traceback (most recent call last):
File "SearXNG/searx/search/processors/online.py", line 154, in search
search_results = self._search_basic(query, params)
File "SearXNG/searx/search/processors/online.py", line 128, in _search_basic
self.engine.request(query, params)
File "SearXNG/searx/engines/qwant.py", line 98, in request
q_locale = get_engine_locale(params['language'], supported_languages, default='en_US')
File "SearXNG/searx/locales.py", line 216, in get_engine_locale
locale = babel.Locale.parse(searxng_locale, sep='-')
File "SearXNG/local/py3/lib/python3.8/site-packages/babel/core.py", line 330, in parse
raise UnknownLocaleError(input_id)
```
This patch implements simple exception handling: since e.g. `de-TW` does not
exist, `de` will be used to get the engine's locale. For invalid terms like
`xy-XY` the default will be returned.
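A hedged sketch of the exception handling (simplified; the real code is in
searx/locales.py):
```python
import babel

def parse_searxng_locale(searxng_locale: str, default: str = 'en_US') -> babel.Locale:
    try:
        return babel.Locale.parse(searxng_locale, sep='-')
    except babel.UnknownLocaleError:
        pass
    try:
        # e.g. 'de-TW' does not exist --> fall back to the language part 'de'
        return babel.Locale.parse(searxng_locale.split('-')[0])
    except babel.UnknownLocaleError:
        # invalid terms like 'xy-XY' --> use the default
        return babel.Locale.parse(default)
```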
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The request function should not request a language (aka locale) that is not
supported by qwant. Selecting a locale like zh-TW ends in qwant's API error::
ERROR searx.engines.qwant news: exception : \
API error::locale must be one of the following values: \
en_gb, en_ie, en_us, en_ca, en_my, en_au, en_nz, de_de, de_ch, de_at, fr_fr, \
fr_be, fr_ch, fr_ca, fr_ad, fc_ca, co_fr, es_es, es_ar, es_cl, es_co, es_mx, \
es_pe, es_ad, ca_es, ca_ad, ca_fr, eu_es, eu_fr, it_it, it_ch, pt_pt, pt_ad, \
nl_be, nl_nl
The existing searx.utils.match_language function is unsuitable for this purpose;
it is replaced by the function searx.locales.get_engine_locale, which is based on
methods from the babel package.
Qwant's _fetch_supported_languages function has been revised to filter out
languages (aka locales) not supported by qwant.
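A hedged sketch of the locale selection in the request function; the
get_engine_locale call follows the traceback above, while the URL handling and
the sample locale list are illustrative:
```python
from urllib.parse import urlencode
from searx.locales import get_engine_locale

# illustrative subset; the real mapping is built by _fetch_supported_languages
supported_languages = {'en-GB': 'en_gb', 'de-DE': 'de_de', 'fr-FR': 'fr_fr'}

def request(query, params):
    # pick a locale qwant actually supports instead of passing the raw value through
    q_locale = get_engine_locale(params['language'], supported_languages, default='en_US')
    params['url'] = 'https://api.qwant.com/v3/search/web?' + urlencode(
        {'q': query, 'locale': q_locale}
    )
    return params
```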
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The match_language function sometimes returns incorrect results, which is why a
new function get_engine_locale is required.
A bugfix of match_language is not easily possible, because there is almost no
documentation for it and even the call parameters are left undefined. E.g. the
function processes values like the ones from yahoo::
"yahoo": [
"ar",
...
"zh_chs",
"zh_cht"
]
The get_engine_locale has been documented in detail, there is a clear
description of the assumptions as well as the requirements and approximation
rules (read doc-string for more details)::
Argument ``engine_locales`` is a python dict that maps *SearXNG locales* to
corresponding *engine locales*:
<engine>: {
# SearXNG string : engine-string
'ca-ES' : 'ca_ES',
'fr-BE' : 'fr_BE',
'fr-CA' : 'fr_CA',
'fr-CH' : 'fr_CH',
'fr' : 'fr_FR',
...
'pl-PL' : 'pl_PL',
'pt-PT' : 'pt_PT'
}
.. hint::
The *SearXNG locale* string has to be known by babel!
In the following you will find a comparison:
>>> import babel.languages
>>> from searx.utils import match_language
>>> from searx.locales import get_engine_locale
Assume we have an engine that supports the following locales:
>>> lang_list = {
... "zh-CN": "zh_CN",
... "zh-HK": "zh_HK",
... "nl-BE": "nl_BE",
... "fr-CA": "fr_CA",
... }
Assumption:
A. When a user selects a language the results should be optimized according to
the selected language.
B. When user selects a language and a territory the results should be
optimized with first priority on territory and second on language.
----
Example: (Assumption A.)
A user selects region 'zh-TW' which should end in zh_HK
hint:
CN is 'Hans' and HK ('Hant') fits better to TW ('Hant')
>>> get_engine_locale('zh-TW', lang_list)
'zh_HK'
>>> lang_list[match_language('zh-TW', lang_list)]
'zh_CN'
----
Example: (Assumption A.)
A user selects only the language 'zh' which should end in zh_CN
>>> get_engine_locale('zh', lang_list)
'zh_CN'
>>> lang_list[match_language('zh', lang_list)]
'zh_CN'
----
Example: (Assumption B.)
A user selects region 'fr-BE' which should end in nl-BE
hint:
priority should be on the territory the user selected. If the user
prefers 'fr' he will select 'fr' without a region tag.
>>> get_engine_locale('fr-BE', lang_list, default='unknown')
'nl_BE'
>>> match_language('fr-BE', lang_list, fallback='unknown')
'fr-CA'
----
Example: (Assumption A.)
A user selects only the language 'fr' which should end in fr_CA
>>> get_engine_locale('fr', lang_list)
'fr_CA'
>>> lang_list[match_language('fr', lang_list)]
'fr_CA'
----
The difference in priority on the territory is best shown with an engine that
supports the following locales:
>>> lang_list = {
... "fr-FR": "fr_FR",
... "fr-CA": "fr_CA",
... "en-GB": "en_GB",
... "nl-BE": "nl_BE",
... }
----
Example: (Assumption A.)
A user selects only a language
>>> get_engine_locale('en', lang_list)
'en_GB'
>>> match_language('en', lang_list)
'en-GB'
hint: the engine supports fr_FR and fr_CA; since no territory is given, fr_FR
takes priority.
>>> get_engine_locale('fr', lang_list)
'fr_FR'
>>> lang_list[match_language('fr', lang_list)]
'fr_FR'
----
Example: (Assumption B.)
A user selects region 'fr-BE' which should end in nl-BE
>>> get_engine_locale('fr-BE', lang_list)
'nl_BE'
>>> lang_list[match_language('fr-BE', lang_list)]
'fr_FR'
----
If the user selects a language and there are two locales like the following:
>>> lang_list = {
... "fr-BE": "fr_BE",
... "fr-CH": "fr_CH",
... }
>>>
>>> get_engine_locale('fr', lang_list)
'fr_BE'
>>> lang_list[match_language('fr', lang_list)]
'fr_BE'
Looks like both functions return the same value, but match_language depends on the
order of the dictionary (which is not predictable):
>>> lang_list = {
... "fr-CH": "fr_CH",
... "fr-BE": "fr_BE",
... }
>>> get_engine_locale('fr', lang_list)
'fr_BE'
>>> lang_list[match_language('fr', lang_list)]
'fr_CH'
>>>
The get_engine_locale function selects the locale by looking at the "population percent";
this percentage is higher in BE (68%) compared to CH (21%).
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
By using the new property `qwant_categ:`, the qwant category is no longer bound
to the SearXNG category.
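A minimal sketch of the decoupling, assuming the property is a plain engine
attribute (the values are illustrative):
```python
# SearXNG category the engine is listed under in the UI
categories = ['news']
# independently, the category name sent to the qwant API
qwant_categ = 'news'
```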
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Neeva is "the world's first ad-free, private search engine" and uses data from Apple, Bing, Yelp and "others".
They claim to crawl "hundreds of millions" of URLs a day (https://twitter.com/Neeva/status/1536447373903335426).
Some HTTP clients have issues with the ``opensearch.xml`` from SearXNG
(related [1][2]) while other OpenSearch descriptions[3] (e.g. from qwant) work
flawlessly.
Inspired by the OpenSearch description from qwant and with information from the
specification[4], the ``opensearch.xml`` has been *improved*.
- convert `<Url>` methods from lower case to upper case (`POST`|`GET`)
- add `<moz:SearchForm>` and `xmlns:moz="http://www.mozilla.org/2006/browser/search/"`
- add `<Query role="example" searchTerms="SearXNG" />` [4]
OpenSearch description documents should include at least one Query element of
`role="example"` that is expected to return search results. Search clients may
use this example query to validate that the search engine is working properly.
- modified `<LongName>` to SearXNG
- modified `<Description>`: the word 'hackable' scares uninitiated users and was removed
- add the `type="image/png"` to `<Image>`
Tests can be done by::
make run
Visit http://127.0.0.1:8888/ and add the search engine to your web browser /
test with different web browsers on desktop and smartphones (are there any iOS
users here? please test on Safari and Chrome).
[1] https://app.element.io/#/room/#searxng:matrix.org/$xN_abdKhNqUlgXRBrb_9F3pqOxnSzGQ1TG0s0G9hQVw
[2] https://github.com/searxng/searxng/issues/431
[3] https://developer.mozilla.org/en-US/docs/Web/OpenSearch
[4] https://github.com/dewitt/opensearch/blob/master/opensearch-1-1-draft-6.md#the-query-element
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
This implements the DeepL translation engine. It works nearly like lingva, but
talks directly to the DeepL API. This API only needs a to-lang; from-lang is
faked for now.
There is a free option to use [1].
[1] https://www.deepl.com/pro-api?cta=header-pro-api for registering a free account.
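A hedged sketch of a translation request against DeepL's v2 API as described
above; the endpoint and parameter names follow DeepL's public API, the key is a
placeholder:
```python
import requests

API_KEY = 'YOUR-DEEPL-API-KEY'   # placeholder; a free account can be registered, see [1]

resp = requests.post(
    'https://api-free.deepl.com/v2/translate',
    data={'auth_key': API_KEY, 'text': 'hello world', 'target_lang': 'DE'},
    timeout=10,
)
print(resp.json()['translations'][0]['text'])   # only the to-lang is required
```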
yep.com is still in beta; api.yep.com does not have paging support. There is
only a 'limit' argument with a maximum of 100 results.
yep.com seems fast; there is no need for a timeout of 12 sec.
The API returns JSON regardless of the HTTP Accept header; the "show more"
button on yep.com's web site does not set a special Accept header either.
FYI: the index does not support languages, the web UI does not offer a language
selection for the results, and the entire index seems to be in English.
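A hedged sketch of a paging-less request; only the 'limit' argument (max 100)
and the api.yep.com host are taken from the description above, the path and
remaining handling are illustrative:
```python
from urllib.parse import urlencode

def request(query, params):
    args = urlencode({'q': query, 'limit': 100})           # no paging, only a limit (max 100)
    params['url'] = 'https://api.yep.com/search?' + args   # path is illustrative
    # no special Accept header needed: the API returns JSON anyway
    return params
```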
Closes: https://github.com/searxng/searxng/issues/1619
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>