On some queries (like an IT error message), Wikipedia returns an HTTP error 400.
This commit returns an empty result instead of showing an error to the user.
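A minimal sketch of the idea, assuming a searx-style response(resp) callback;
the JSON handling around the status check is illustrative only::

    from json import loads

    def response(resp):
        results = []
        # Wikipedia answers some queries (e.g. an IT error message) with
        # HTTP 400 -- return an empty result list instead of raising
        if resp.status_code == 400:
            return results
        api_result = loads(resp.text)
        # ... build `results` from `api_result` as before ...
        return results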
The duckduckgo engine requires an additional request after the results have been sent.
This commit makes sure that the second request uses the same HTTPAdapter,
i.e. the same IP address and the same proxy.
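An illustrative sketch with plain requests rather than searx's own request
layer: with one Session and a single mounted HTTPAdapter, the follow-up request
reuses the same adapter and therefore the same connection pool and proxy as the
first one (URLs, parameters and the proxy below are placeholders)::

    import requests
    from requests.adapters import HTTPAdapter

    session = requests.Session()
    session.mount('https://', HTTPAdapter())                  # one adapter == one pool
    session.proxies = {'https': 'socks5h://127.0.0.1:9050'}   # hypothetical proxy

    # first request: fetch the result page
    first = session.get('https://duckduckgo.com/html/', params={'q': 'searx'})
    # ... parse `first`, build the follow-up request ...
    # second request: goes through the same adapter / proxy as the first
    second = session.get('https://duckduckgo.com/html/', params={'q': 'searx', 's': '30'})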
The new version of MetaGer needs to reload the results (into an iframe) with a
unique tag (see the HTML response below).
Implementing a dedicated metager engine for searx makes no sense to me. The
great days of MetaGer seem to be over. I remember the good old days when this
project started in the 1990s, but in the last few years it has become more and
more crap. As the name suggests, MetaGer was made for Germans in the first
place. They have added an English and a Spanish translation, but the i18n is
very poor compared to what searx offers.
It's a pity; let's drop MetaGer.
This is the first response; the id (b82679980656899ba5a17ffd02a56846) is unique
for each query:
$ curl "https://metager.org/meta/meta.ger3?eingabe=foo&submit-query=&focus=web"
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<link rel="stylesheet" href="/index.css?id=b82679980656899ba5a17ffd02a56846">
<script src="/index.js?id=b82679980656899ba5a17ffd02a56846"></script>
<title>foo - MetaGer</title>
<meta name="viewport" content="width=device-width, initial-scale=1, maximum-scale=1" />
</head>
<body>
<iframe id="mg-framed" src="https://metager.org/meta/meta.ger3?eingabe=foo&submit-query=&focus=web&mgv=b82679980656899ba5a17ffd02a56846" autofocus="true" onload="this.contentWindow.focus();"></iframe>
</body>
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Some of our interface locales include uppercase country codes,
which are separated by `_` instead of the more common `-`.
Also, a browser's `Accept-Language` header could be in lowercase.
This commit attempts to normalize those cases so a browser's
language+country codes can better match with our locales.
This solution assumes that our UI locales contain nothing more than a
language and optionally a country. If we ever add a script-specific
locale like `zh-Hant-TW`, this would have to change to accommodate
that, but the idea would be pretty much the same as this fix.
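A rough sketch of the matching idea; the helper names are hypothetical and the
fallback rules are simplified::

    def normalize_locale(code):
        """Map 'zh_TW', 'pt-br', ... to a common 'language-COUNTRY' form."""
        parts = code.replace('_', '-').split('-')
        language = parts[0].lower()
        if len(parts) == 1:
            return language
        # assumption: locales carry at most language + country, no scripts
        return '{}-{}'.format(language, parts[1].upper())

    def match_locale(accept_languages, ui_locales):
        """Return the UI locale best matching the browser's Accept-Language list."""
        available = {normalize_locale(loc): loc for loc in ui_locales}
        for code in (normalize_locale(lang) for lang in accept_languages):
            if code in available:
                return available[code]
            # fall back to a language-only match, e.g. 'de-AT' -> 'de'
            if code.split('-')[0] in available:
                return available[code.split('-')[0]]
        return None

    # e.g. match_locale(['de-at', 'en'], ['de', 'en-US', 'zh_TW']) -> 'de'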
Error:
Configuration error:
There is a programmable error in your configuration file:
...
NameError: name 'DOCS_URL' is not defined
make: *** [utils/makefile.sphinx:156: books/user.latex] Error 2
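A purely hypothetical illustration of a guard that avoids such a NameError:
docs/conf.py can read DOCS_URL from the environment (exported by the Makefile)
and fall back to a default instead of referencing an undefined name::

    import os

    # hypothetical guard -- not necessarily the actual patch
    DOCS_URL = os.environ.get('DOCS_URL', 'https://searx.github.io/searx/')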
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
The language_support variable is set to True by default,
and set to False in only 5 engines.
Except for the documentation and the /config URL, this variable is not used.
This commit removes the variable definition in the engines and sets the value
according to the length of supported_languages: False when the length is 0,
True otherwise.
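A short sketch of the new rule, with an example value standing in for an
engine's real list::

    supported_languages = ['en', 'de', 'fr']          # example value only
    language_support = len(supported_languages) > 0   # False only for an empty list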
Closes #2485
aka: ensure that 'make test' works as expected
The cache contains a copy of './local' which is - under some circumstances -
corrupted. It is not possible to clear the cache [1] (see the top of the page).
Ensure that 'make test' works as expected [2] even if
- the python interpreter is missing
- the virtualenv exists but pyyaml is missing
To harden against failures of the workflow cache, this patch adds a new target
'travis.test' to the workflow. This target probes whether the Python module
'yaml' can be imported. If this fails, the virtualenv is rebuilt from scratch.
[1] https://github.com/actions/cache/issues/2#issuecomment-673493515
[2] https://github.com/searx/searx/pull/2517#discussion_r567240235
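A rough Python equivalent of the probe behind the new 'travis.test' target (the
real check lives in the Makefile and runs the interpreter from ./local)::

    import sys

    try:
        import yaml  # noqa: F401 -- provided by pyyaml inside the virtualenv
    except ImportError:
        # a non-zero exit tells the workflow that the cached virtualenv is
        # corrupted and must be rebuilt from scratch
        sys.exit(1)
    sys.exit(0)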
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Target pip-exe is a prerequisite of the targets:
- pyinstall
- pyuninstall
and was accidentally deleted in commit 9b48ae47.
HINT:
do not confuse pyinstall with pyenvinstall
pyinstall & pyuninstall
Installing into the user's HOME using the pip from the OS,
therefore the message is needed.
pyenvinstall & pyenvuninstall
Installing into the virtualenv (./local) using the pip which is provided by the
prerequisite 'pyenv' in the virtualenv.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
1. This patch fixes this error:
rm -rf gh-pages/
make V=1 gh-pages
make[1]: Leaving directory '/800GBPCIex4/share/searx'
[ -d "gh-pages/.git" ] || git clone gh-pages
fatal: repository 'gh-pages' does not exist
2. The gh-pages build has been moved to ./build/gh-pages; this also affects
'travis-gh-pages'.
3. The gh-pages commit messages now include a reference to the repository and commit.
4. Since a gh-pages history has only the drawback that the repository grows
fast, this patch also flattens the history:
cd build/gh-pages/; git log --oneline
bash: cd: build/gh-pages/: No such file or directory
026126be (HEAD -> gh-pages, origin/gh-pages) make gh-pages: from https://github.com/return42/searx.git@71d66979c2935312e0aed7fc7c3cf6199fbe88a2
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
Avoid SearxEngineXPathException errors when parsing invalid results::
.//div[@class="yuRUbf"]//a/@href index 0 not found
Traceback (most recent call last):
File "./searx/engines/google.py", line 274, in response
url = eval_xpath_getindex(result, href_xpath, 0)
File "./searx/searx/utils.py", line 608, in eval_xpath_getindex
raise SearxEngineXPathException(xpath_spec, 'index ' + str(index) + ' not found')
searx.exceptions.SearxEngineXPathException: .//div[@class="yuRUbf"]//a/@href index 0 not found
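A minimal sketch of the defensive handling (not necessarily the exact patch):
skip a result block whose href XPath yields nothing instead of letting the
exception bubble up::

    from searx.exceptions import SearxEngineXPathException
    from searx.utils import eval_xpath_getindex

    href_xpath = './/div[@class="yuRUbf"]//a/@href'

    def extract_url(result):
        try:
            return eval_xpath_getindex(result, href_xpath, 0)
        except SearxEngineXPathException:
            # non-valid result block: no link found, the caller skips this entry
            return None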
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
BTW: fix indentation by 2 spaces
The additional tests have been commented out in the google engines so as not to
trigger any CAPTCHA issues.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>