If test_engines_init.py runs before test_standalone_searx.py, the engine list is not empty.
This makes test_get_search_query flaky.
This commit initializes the engine list in test_standalone_searx.py.
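A minimal sketch of the idea, assuming searx.engines.initialize_engines(), searx.testing.SearxTestCase and a small test engine list; the exact names and settings used in the real test may differ:

```python
from searx import engines
from searx.testing import SearxTestCase

# illustrative engine settings, not the real test fixture
TEST_ENGINES = [{
    'name': 'dummy engine',
    'engine': 'dummy',
    'categories': 'general',
    'shortcut': 'du',
    'timeout': 3.0,
}]

class StandaloneSearxTestCase(SearxTestCase):
    def setUp(self):
        # do not rely on another test module having populated the engine list
        engines.initialize_engines(TEST_ENGINES)
```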
Use
from searx.engines.duckduckgo import _fetch_supported_languages, supported_languages_url # NOQA
so it is possible to easily remove all unused imports using autoflake:
autoflake --in-place --recursive --remove-all-unused-imports searx tests
* URL / : the index page displays the selected or the default category.
* URL / : when the q parameter is set using the URL, the redirect includes the URL query.
* URL /search : an empty query doesn't raise an exception.
This makes it easier to separately handle search and index requests
from a web server or from a reverse proxy.
If a request to the index contains a query, a permanent HTTP redirect response
is returned. This should give some level of backwards compatibility
for users who have set a searx instance in their browser's search bar.
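A minimal sketch of the routing behaviour described above, written with plain Flask rather than the actual searx webapp code; the response bodies and the 308 status code are illustrative assumptions:

```python
from flask import Flask, redirect, request, url_for

app = Flask(__name__)

@app.route('/', methods=['GET', 'POST'])
def index():
    query = request.values.get('q')
    if query:
        # a permanent redirect keeps old "https://instance/?q=..." search-bar
        # entries working, while / itself only renders the index page
        return redirect(url_for('search', q=query), code=308)
    return 'index page (selected or default category)'

@app.route('/search', methods=['GET', 'POST'])
def search():
    query = request.values.get('q')
    if not query:
        # an empty query must not raise an exception
        return 'index page (empty query)'
    return 'results for: ' + query
```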
The XPath engine and results template changed to account for the fact that
archive.org doesn't cache .onions, though some onion engines might have
their own cache.
Disabled by default. Can be enabled by setting the SOCKS proxies to
wherever Tor is listening and setting using_tor_proxy to True.
Requires Tor and updated packages.
To avoid manually adding the timeout to each engine, you can set
extra_proxy_timeout to account for the extra time Tor (or whatever proxy
is used) needs.
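The sketch below only illustrates the mechanism with the requests library and Tor's default SOCKS port; it is not the searx implementation, and the constant names are made up to mirror the settings described above:

```python
import requests  # requires the requests[socks] extra for socks5h support

# socks5h resolves DNS through Tor, which is needed to reach .onion hosts
TOR_PROXIES = {
    'http': 'socks5h://127.0.0.1:9050',
    'https': 'socks5h://127.0.0.1:9050',
}
EXTRA_PROXY_TIMEOUT = 10.0  # mirrors the extra_proxy_timeout idea

def fetch(url, engine_timeout=3.0):
    # pad the per-engine timeout with an allowance for the proxy round trip
    return requests.get(url, proxies=TOR_PROXIES,
                        timeout=engine_timeout + EXTRA_PROXY_TIMEOUT)
```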
The engine reference was previously a dict with two or three keys: name, category, from_bang.
A dedicated class makes clear that this is an engine reference (see tests/unit/test_search.py for an example).
All variables using this class are renamed accordingly.
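A hedged sketch of what such a class could look like, built only from the keys listed above; the actual class name and signature in searx may differ:

```python
class EngineRef:
    """Reference to an engine selected for a search (illustrative only)."""

    def __init__(self, name, category, from_bang=False):
        self.name = name            # engine name, e.g. 'duckduckgo'
        self.category = category    # category the engine was selected from
        self.from_bang = from_bang  # True if the engine was selected via a !bang

    def __repr__(self):
        return 'EngineRef({!r}, {!r}, {!r})'.format(
            self.name, self.category, self.from_bang)
```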
A new "base" engine called command is introduced. It is the foundation for all command line engines for now.
You can use this engine to create your own command line engine.
Add some engines (commented out to make sure no one enables anything accidentally):
* git grep: This engine lets you grep in the searx repo.
* locate: If locate is installed and initialized, you can search on the FS.
* find: You can find files with a specific name from where you started searx.
* pattern search in files: This engine utilizes the command fgrep.
* regex search in files: This engine runs `grep` to find a file based on its contents.
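The sketch below is not the actual command engine, only an illustration of the idea behind the engines listed above: run an external program with the query substituted into its argument list and turn each output line into a result. The {{QUERY}} placeholder, the default command and the result shape are assumptions:

```python
import subprocess

def search(query, command=('fgrep', '-r', '{{QUERY}}', '.'), timeout=5):
    # substitute the query into the command template
    cmd = [arg.replace('{{QUERY}}', query) for arg in command]
    proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
    # one result per non-empty output line
    return [{'content': line} for line in proc.stdout.splitlines() if line.strip()]
```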
* Made first attempt at the bangs redirects plugin.
* It redirects. But in a messy way via javascript.
* First version with custom plugin
* Added a help page and an operator to see all the available bangs.
* Changed to .format because of support
* Changed to .format because of support
* Removed : in params
* Fixed path to json file and changed bang operator
* Changed bang operator back to &
* Refactored getting the search query. Also changed the bang operator to ! and it is now working.
* Removed prints
* Removed temporary bangs_redirect.js file. Updated plugin documentation
* Added unit test for the bangs plugin
* Fixed a unit test and added 2 more for bangs plugin
* Changed back to default settings.yml
* Added myself to AUTHORS.rst
* Refactored the working of the custom plugin.
* Refactored _get_bangs_data from list to dict to improve search speed.
* Decoupled bangs plugin from webserver with redirect_url
* Refactored bangs unit tests
* Fixed the bangs unit test. Removed double parsing in bangs.py
* Removed a dumb print statement
* Refactored bangs plugin to core engine.
* Removed bangs plugin.
* Refactored external bangs unit tests from plugin to core.
* Removed custom_results/bangs documentation from plugins.rst
* Added newline in settings.yml so the PR stays clean.
* Changed searx/plugins/__init__.py back to the old file
* Removed newline search.py
* Refactored get_external_bang_operator from utils to external_bang.py
* Removed unnecessary import from test_plugins.py
* Removed _parseExternalBang and _isExternalBang from query.py
* Removed get_external_bang_operator since it was not necessary
* Simplified external_bang.py
* Simplified external_bang.py
* Moved external_bangs unit tests to test_webapp.py. Fixed return in search with external_bang
* Refactored query parsing to unicode to support python2
* Refactored query parsing to unicode to support python2
* Refactored bangs plugin to core engine.
* Refactored search parameter to search_query in external_bang.py
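A rough sketch of the mechanism that emerged from this series: bang definitions live in a dict for fast lookup, and a matching query yields a redirect_url that the webapp returns instead of running a search. The '!!' prefix, the data layout and the function name are assumptions for illustration only:

```python
from urllib.parse import quote_plus

# illustrative bang table; the real data is loaded from a JSON file
EXTERNAL_BANGS = {
    'g': 'https://www.google.com/search?q={query}',
    'w': 'https://en.wikipedia.org/wiki/Special:Search?search={query}',
}

def get_bang_url(query):
    """Return a redirect URL if the query uses an external bang, else None."""
    if not query.startswith('!!'):
        return None
    bang, _, terms = query[2:].partition(' ')
    template = EXTERNAL_BANGS.get(bang)
    if template is None or not terms:
        return None
    return template.format(query=quote_plus(terms))
```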
Commit 2c6531b2 not only breaks the unit test, it is also a significant change of
the data model and the searx search-syntax model (UI), made without any discussion or
documentation.
In the end, adding routes to instant answers is a nice feature, but commit
2c6531b2 leaves some questions open.
In that sense, this patch is only a hotfix, not an assessment.
Signed-off-by: Markus Heiser <markus.heiser@darmarit.de>
This PR fixes the result count from Bing, which was throwing a (hidden) error, and adds a validation to avoid reading more results than available.
For example:
If there are 100 results for some search and we try to get results 120 to 130, Bing will send back the results from 0 to 10 with no error. By comparing the result count with the "first" parameter of the request, we can skip these "invalid" results.
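A hedged sketch of that validation (not the actual searx bing.py); the function and parameter names are made up:

```python
def validate_page(results, reported_result_count, first_result_offset):
    """Drop results when Bing silently wrapped around to the first page.

    reported_result_count -- total number of results Bing claims to have
    first_result_offset   -- the 'first' parameter sent in the request
    """
    if first_result_offset > reported_result_count:
        return []
    return results
```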
The new URL parameter "timeout_limit" sets the timeout limit in seconds.
Example "timeout_limit=1.5" means the timeout limit is 1.5 seconds.
In addition, the query can start with <[number] to set the timeout limit.
For numbers between 0 and 99, the unit is seconds:
Example: "<30 searx" means the timeout limit is 30 seconds.
For numbers above 100, the unit is milliseconds:
Example: "<850 searx" means the timeout is 850 milliseconds.
In addition, there is a new optional setting: outgoing.max_request_timeout.
If it is not set, the user timeout can't go above the searx configuration (as before: the maximum timeout of the selected engines for a query).
If the value is set, the user can choose a timeout between 0 and max_request_timeout using
<[number] or the timeout_limit query parameter.
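A minimal sketch of the parsing rule described above (not the actual searx query parser); the handling of the boundary value 100 and the function name are assumptions:

```python
def parse_timeout_prefix(query):
    """Return (timeout_limit in seconds or None, remaining query)."""
    if not query.startswith('<'):
        return None, query
    prefix, _, rest = query[1:].partition(' ')
    try:
        value = float(prefix)
    except ValueError:
        return None, query
    # 0..99 -> seconds, 100 and above -> milliseconds (assumed boundary)
    timeout = value if value < 100 else value / 1000.0
    return timeout, rest
```

With outgoing.max_request_timeout set, the resulting value would additionally be clamped to that maximum.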
Related to #1077
Updated version of PR #1413 from @isj-privacore