I'm trying to access a webpage. I tried fake_useragent's UserAgent to add a User-Agent header, but I still get a timeout error. My new code:
from fake_useragent import UserAgent
import requests

url = "https://www.bestbuy.com/site/lg-65-class-oled-b9-series-2160p-smart-4k-uhd-tv-with-hdr/6360611.p?skuId=6360611"

ua = UserAgent()
print(ua.chrome)  # a random Chrome User-Agent string from fake_useragent
header = {'User-Agent': str(ua.chrome)}
print(header)
url_get = requests.get(url, headers=header)
print(url_get)
The traceback (abridged):

--> 285         raise SocketError(str(e))
    286 except OpenSSL.SSL.ZeroReturnError as e:

OSError: (60, 'ETIMEDOUT')

During handling of the above exception, another exception occurred:

ProtocolError                             Traceback (most recent call last)
/anaconda3/lib/python3.6/site-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
    439                     retries=self.max_retries,
--> 440                     timeout=timeout
    441                 )

--> 285         raise SocketError(str(e))
    286 except OpenSSL.SSL.ZeroReturnError as e:

ProtocolError: ('Connection aborted.', OSError("(60, 'ETIMEDOUT')",))

During handling of the above exception, another exception occurred:
You don't need to use fake_useragent. Just try something like this: pass a real browser User-Agent header and your browser cookies to the request.
import requests

url = "https://www.bestbuy.com/site/lg-65-class-oled-b9-series-2160p-smart-4k-uhd-tv-with-hdr/6360611.p?skuId=6360611"

# A real browser User-Agent string, plus the cookies your browser sends
agent = {"User-Agent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/78.0.3904.87 Safari/537.36'}
cookies = {"cookie": "COPY_HERE_YOUR_COOKIE_FROM_BROWSER"}

url_get = requests.get(url, headers=agent, cookies=cookies)
print(url_get.text)
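If the request still hangs, it also helps to fail fast and retry instead of waiting on the default (unbounded) timeout. A minimal sketch using requests' standard Session/Retry machinery, reusing the agent and cookies variables from above; the timeout and retry numbers are illustrative assumptions, not values from the answer:

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
# Retry up to 3 times with exponential backoff on transient failures.
retries = Retry(total=3, backoff_factor=1, status_forcelist=[429, 500, 502, 503, 504])
session.mount("https://", HTTPAdapter(max_retries=retries))

# timeout=10 makes the call raise after 10 seconds instead of hanging.
url_get = session.get(url, headers=agent, cookies=cookies, timeout=10)
print(url_get.status_code)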
If you don't know how to get the cookies, right-click in your browser (Chrome, for example) -> Inspect > Network, then reload the page, select the first request, and look at its request headers for the Cookie value. This code works for me.
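One detail: requests treats each dict entry as a separate cookie, while DevTools shows the whole Cookie header as a single "name1=value1; name2=value2" string. If you want to split that string into individual cookies, a small sketch (cookie_header_to_dict is a hypothetical helper for this example, not part of requests):

# Split a raw Cookie header copied from DevTools ("a=1; b=2")
# into the {"a": "1", "b": "2"} dict that requests expects.
def cookie_header_to_dict(raw_cookie_header):
    return dict(pair.split("=", 1) for pair in raw_cookie_header.split("; ") if "=" in pair)

cookies = cookie_header_to_dict("COPY_HERE_YOUR_COOKIE_FROM_BROWSER")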