66

I am fetching a URL with:

r = requests.get("http://myserver.com")

As I can see in the access.log of myserver.com, the request goes through the client's system proxy. But I want to disable the use of proxies with requests entirely.

t777

6 Answers

132

The only way I'm currently aware of for disabling proxies entirely is the following:

  • Create a session
  • Set session.trust_env to False
  • Create your request using that session
import requests

session = requests.Session()
session.trust_env = False

response = session.get('http://www.stackoverflow.com')

This is based on this comment by Lukasa and the (limited) documentation for requests.Session.trust_env.

Note: Setting trust_env to False also ignores the following:

  • Authentication information from .netrc (code)
  • CA bundles defined in REQUESTS_CA_BUNDLE or CURL_CA_BUNDLE (code)
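Since trust_env=False turns off those lookups as well, anything they would have supplied must be passed explicitly. A minimal sketch (the credentials and bundle path are placeholders, not values from the question):

```python
import requests

session = requests.Session()
session.trust_env = False  # no env proxies, no .netrc, no CA-bundle env vars

# Supply explicitly what trust_env would otherwise have picked up:
session.auth = ("user", "secret")          # instead of a .netrc lookup
session.verify = "/path/to/ca-bundle.pem"  # instead of REQUESTS_CA_BUNDLE
```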

If however you only want to disable proxies for a particular domain (like localhost), you can use the NO_PROXY environment variable:

import os
import requests

os.environ['NO_PROXY'] = 'stackoverflow.com'

response = requests.get('http://www.stackoverflow.com')
Lukas Graf
  • Has `trust_env=False` any other (side-)effects than disabling the proxy? – t777 Feb 15 '15 at 00:45
  • 7
    Actually, on one of my servers, `no_proxy` is the correct answer (lower case). – boh717 Nov 17 '15 at 13:33
  • another trick is to make urllib.getproxies return a non-empty dict (urllib.getproxies = lambda: {'z': 'z'}); then requests will not read proxy settings from the environment or OS settings. – cfy Nov 30 '15 at 02:33
  • 2
    `os.environ['NO_PROXY'] = os.environ['NO_PROXY'] + ',' + 'stackoverflow.com'` — appending to the list, rather than overwriting it, keeps your existing proxy exceptions – LeDerp Jan 29 '19 at 15:44
  • `trust_env = False` solution worked perfectly, thanks for the solution! – Mishal Shah Oct 22 '20 at 10:02
  • `NO_PROXY` must be `no_proxy` – AstraSerg Feb 17 '21 at 14:42
  • `os.environ['NO_PROXY'] = 'stackoverflow.com'` worked perfectly, thanks a lot! – Steven Lee Aug 25 '21 at 16:48
  • The environment variables, especially NO_PROXY, are a convention rather than a standard. In addition, as some commenters noted about 'no_proxy' versus 'NO_PROXY': in Windows environment variables are case-insensitive, but in Unix-like systems (macOS and Linux) they are case-sensitive! The problem on macOS or Unix is that if one user sets 'NO_PROXY' and another assumes 'no_proxy' (or vice versa), bugs are predestined. Thus the `proxies=` argument method is cleaner and always correct. – Gwang-Jin Kim Apr 13 '22 at 03:10
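Picking up LeDerp's suggestion above, a sketch that appends a host to NO_PROXY without clobbering existing entries (and tolerates the variable being unset):

```python
import os

# Append a host to NO_PROXY, keeping whatever entries are already there.
existing = os.environ.get('NO_PROXY', '')
hosts = [h for h in existing.split(',') if h] + ['stackoverflow.com']
os.environ['NO_PROXY'] = ','.join(hosts)
```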
75

You can choose proxies for each request. From the docs:

import requests

proxies = {
  "http": "http://10.10.1.10:3128",
  "https": "http://10.10.1.10:1080",
}

requests.get("http://example.org", proxies=proxies)

So to disable the proxy, just set each one to None:

import requests

proxies = {
  "http": None,
  "https": None,
}

requests.get("http://example.org", proxies=proxies)
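As an alternative to passing proxies on every call, the same mapping can be set once on a Session. A sketch; note that unless trust_env is also set to False, requests may still merge proxy settings from the environment at request time:

```python
import requests

session = requests.Session()
session.trust_env = False  # don't merge proxies from the environment
session.proxies = {"http": None, "https": None}  # explicit: no proxies

# Every request made through this session now bypasses proxies:
# session.get("http://example.org")
```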
jtpereyda
9

The way to stop requests/urllib from proxying any requests is to set the no_proxy (or NO_PROXY) environment variable to *, e.g. in bash:

export no_proxy='*'

Or from Python:

import os
os.environ['no_proxy'] = '*' 

This works because the urllib.request.getproxies function first checks for proxies set in environment variables (e.g. http_proxy, HTTP_PROXY, https_proxy, HTTPS_PROXY, etc.); if none are set, it falls back to platform-specific calls to find system-configured proxies (on macOS it queries the scutil/configd interfaces, and on Windows it reads the Registry). As mentioned in the comments, if any proxy variables are set you can reset them as @udani suggested, or unset them like this from Python:

del os.environ['HTTP_PROXY']

Then, when urllib attempts to use a proxy, the ProxyHandler checks for the presence and value of the no_proxy environment variable, which can be set to specific hostnames as mentioned above, or to the special * value, in which case all hosts bypass the proxy.
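The lookup order described above can be observed directly with the stdlib (a sketch using a hypothetical proxy address):

```python
import os
import urllib.request

# A hypothetical proxy address, set only for demonstration.
os.environ['http_proxy'] = 'http://10.0.0.1:3128'
print(urllib.request.getproxies())  # the 'http' entry comes from the env

# With no_proxy='*', every host is reported as bypassing the proxy.
os.environ['no_proxy'] = '*'
print(urllib.request.proxy_bypass('example.org'))
```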

Pierz
  • Using `*` didn't work for me. However, I only had a few exceptions, so listing them explicitly worked great: `export no_proxy='site1.com, site2.domain.com, site3.gov.com'` – Jack G May 24 '21 at 08:07
  • In my case I had a 'HTTP_PROXY' field in the environment variables. Setting the value as os.environ['HTTP_PROXY'] = '-' fixed it. – udani Feb 09 '22 at 19:40
4

The requests library respects proxy environment variables: http://docs.python-requests.org/en/latest/user/advanced/#proxies

So try deleting environment variables HTTP_PROXY and HTTPS_PROXY.

import os
for k in list(os.environ.keys()):
    if k.lower().endswith('_proxy'):
        del os.environ[k]
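If only specific variables need clearing, os.environ.pop with a default avoids the KeyError that del raises when a variable is absent (a minimal sketch):

```python
import os

# Remove both common casings; pop() with a default never raises KeyError.
for name in ('HTTP_PROXY', 'http_proxy', 'HTTPS_PROXY', 'https_proxy'):
    os.environ.pop(name, None)
```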
KostasT
2

With Python3, jtpereyda's solution didn't work, but the following did:

import requests

proxies = {
    "http": "",
    "https": "",
}

requests.get("http://example.org", proxies=proxies)
0
I faced the same issue when connecting to a .NET backend on localhost from a Python script with the requests module. Setting verify to False disables the default SSL certificate verification:

 r = requests.post('https://localhost:44336/api/', data='', verify=False)

P.S. The code above will raise an InsecureRequestWarning, which can be suppressed as follows:

import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
r=requests.post('https://localhost:44336/api/',data='',verify=False)
Adriaan