How can I load Netscape cookies from a file to authenticate against a website's REST API with Python requests (Session), pycurl, or something else? Similar to curl -b ${home}.cookie
-b, --cookie (HTTP) Pass the data to the HTTP server as a cookie. It is supposedly the data previously received from the server in a "Set-Cookie:" line. The data should be in the format "NAME1=VALUE1; NAME2=VALUE2".
import requests

proxi = {'http': 'http://proxy',
         'https': 'http://proxy'}

url = 'http://192.196.1.98:8080/a/changes/?q=status:new'
r = requests.get(url, proxies=proxi)  # cookies=cookie
print(r.status_code)
print(r.json())
print(r.headers)
print(r.request.headers)
print(r.text)
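If pycurl is an option, it reuses curl's own cookie engine, so a Netscape-format file can be passed straight through, just like curl -b. A minimal sketch, assuming the file is named cookies.txt and the same proxy as above:

import pycurl
from io import BytesIO

buffer = BytesIO()
c = pycurl.Curl()
c.setopt(c.URL, 'http://192.196.1.98:8080/a/changes/?q=status:new')
c.setopt(c.COOKIEFILE, 'cookies.txt')  # Netscape-format cookie file, like curl -b
c.setopt(c.PROXY, 'http://proxy')
c.setopt(c.WRITEFUNCTION, buffer.write)
c.perform()
print(c.getinfo(c.RESPONSE_CODE))
c.close()
print(buffer.getvalue().decode('utf-8'))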
The Python requests library's quickstart guide has a section on cookies. It describes RequestsCookieJar, a class for holding cookies, which you need to fill yourself by parsing the Netscape cookie file manually.
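For reference, each non-comment line in a Netscape cookie file holds seven tab-separated fields: domain, include-subdomains flag, path, secure flag, expiry timestamp, name, and value. An illustrative file (values made up):

# Netscape HTTP Cookie File
.website.com	TRUE	/	FALSE	1735689600	sessionid	abc123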
import requests

# Initialize a RequestsCookieJar and fill it from the Netscape-format file
jar = requests.cookies.RequestsCookieJar()
with open("cookies.txt") as cookies_txt:
    for line in cookies_txt:
        words = line.split()
        # Skip comment lines and anything without the seven cookie fields
        if len(words) == 7 and not line.startswith("#"):
            # Fields: domain, flag, path, secure, expiry, name, value
            jar.set(words[5], words[6], domain=words[0], path=words[2])

# Make the request with the loaded cookies
url = "https://website.com"
website = requests.get(url, cookies=jar)
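Alternatively, the standard library's http.cookiejar.MozillaCookieJar understands the Netscape format natively, and requests accepts any cookielib CookieJar via the cookies parameter, so the manual parsing can be skipped. A short sketch, again assuming a cookies.txt file:

import requests
from http.cookiejar import MozillaCookieJar

cj = MozillaCookieJar('cookies.txt')
# Keep session cookies and expired entries instead of silently dropping them
cj.load(ignore_discard=True, ignore_expires=True)

website = requests.get('https://website.com', cookies=cj)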