Tags: python, python-requests, interception

How to get specific cookies from requests when the library doesn't handle them for you


So I have a page I need to log in to constantly, but I can't log in at all unless I put certain cookies in the headers.

The problem is that the cookies Postman gives me eventually expire, and then I have to replace all of them in the code again.
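(For context, one way to stop hand-pasting cookie values, and to reuse them on another machine, is to persist the session's cookie jar to disk and load it back later. A minimal sketch with made-up cookie values standing in for the real ones; Python 3 `print()` syntax here, while the code below is Python 2:)

```python
import os
import pickle
import tempfile

import requests

session = requests.session()
# pretend the server already sent us these cookies (made-up values)
session.cookies.set("ASP.NET_SessionId", "abc123")
session.cookies.set(".ASPXAUTH", "token456")

# dump the whole jar to a file ...
path = os.path.join(tempfile.gettempdir(), "cookies.pkl")
with open(path, "wb") as f:
    pickle.dump(session.cookies, f)

# ... and restore it into a fresh session later, or on another machine
fresh = requests.session()
with open(path, "rb") as f:
    fresh.cookies.update(pickle.load(f))

print(fresh.cookies.get("ASP.NET_SessionId"))  # abc123
```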

##### this makes the login

url = "https://www.APAGE.com"  # login URL

payload = "user=" + dude["user"] + "&password=" + dude["password"] + "&action=login"
headers = {
    'User-Agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0",  # for some reason the page returns an error without this
    'Content-Type': "application/x-www-form-urlencoded",
    'X-Requested-With': "XMLHttpRequest",
    'Cookie': "ASP.NET_SessionId=<a cookie that eventually expires>; __RequestVerificationToken_L0NvbnN1bHRhV2Vi0=<another expirable cookie>",  # i need THESE!
    'Cache-Control': "no-cache",
}

login = session.post(url, data=payload, headers=headers)  # performs the login
print "Login open"
cookie = session.cookies.get_dict()  # get the cookie that has to be re-sent on every request
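(For reference: once a cookie is in the session's jar, `requests` builds the `Cookie` header by itself, so a hand-written `'Cookie'` entry shouldn't be needed. This can be checked offline with a prepared request; the domain and value below are made up:)

```python
import requests

session = requests.session()
# pretend an earlier GET of the login page left this in the jar (made-up value)
session.cookies.set("ASP.NET_SessionId", "abc123", domain="www.apage.com")

# preparing a request (without sending it) shows the Cookie header
# the session would attach on its own
req = requests.Request("POST", "https://www.apage.com/login",
                       data={"user": "dude", "action": "login"})
prepped = session.prepare_request(req)
print(prepped.headers["Cookie"])  # ASP.NET_SessionId=abc123
```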

# here I'm trying to grab the request cookies right after the login,
# so I can re-send them before they expire

print '================'
print login.request.headers
print '================'
print '\n\n\n'
cookie2 = login.headers.get('Set-Cookie')
print login.headers
print cookie2
print login.cookies.get_dict()
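(A debugging aside: `get_dict()` flattens the jar to name → value and drops everything else; iterating the jar shows each cookie's domain, path, and expiry, which helps when figuring out why a cookie isn't being sent. A standalone sketch with made-up values:)

```python
from requests.cookies import RequestsCookieJar

jar = RequestsCookieJar()
jar.set("ASP.NET_SessionId", "abc123", domain="www.apage.com", path="/")
jar.set(".ASPXAUTH", "token456", domain="www.apage.com", path="/")

# each entry in the jar is a full Cookie object, not just a value
info = [(c.name, c.value, c.domain) for c in jar]
print(info)
```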


# makes a GET request to move to the initial page

url = "https://www.APAGE-after-login.com"

headers = {
    'User-Agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0",
    'Referer': "https://www.APAGE-after-login.com",
    'Cookie': "ASP.NET_SessionId=<the cookie again>; __RequestVerificationToken_L0NvbnN1bHRhV2Vi0=<the other cookie>; .ASPXAUTH=" + str(cookie['.ASPXAUTH']),  # I have to re-send the .ASPXAUTH cookie after every request or the session expires
    'Upgrade-Insecure-Requests': "1",
    'Cache-Control': "no-cache",
}

moving = session.get(url, headers=headers)

cookie = session.cookies.get_dict()

I need help getting those cookies so that, when they change, I don't have to rewrite entire sections of the code again and again. Does anyone know how I can intercept those request cookies so I can reuse them? Thanks!
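(If the header really must be built by hand, it can at least be rebuilt from whatever is currently in the jar instead of hardcoding values that expire. A sketch with made-up values:)

```python
import requests

session = requests.session()
# stand-ins for cookies the server would have set
session.cookies.set("ASP.NET_SessionId", "abc123")
session.cookies.set(".ASPXAUTH", "token456")

cookie = session.cookies.get_dict()
# rebuild the header string from the live jar on every request
cookie_header = "; ".join("%s=%s" % kv for kv in cookie.items())
print(cookie_header)
```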

Edit: I already have `session = requests.session()` declared in the code, and I've already tried several solutions to the problem. The code works if I manually place the cookies in the headers, but they expire after a couple of days. For some reason the requests library is not handling these cookies automatically.

If I use this header:

    headers = {
        'User-Agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0",
        'Cookie': ".ASPXAUTH=" + str(cookie['.ASPXAUTH']),
        'Cache-Control': "no-cache",
    }

or any other variation, such as

    moving = session.get(url, headers=headers, cookies=cookie)  # the cookie dict I tried to grab earlier

the login simply doesn't work; it returns an error page.
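(One likely culprit: sending a hand-built `'Cookie'` header and a `cookies=` argument at the same time makes requests reconcile the two, and the jar generally wins, so the manual header can be silently replaced. Passing everything through `cookies=` or the session jar, and dropping the manual header, avoids the conflict. The merge can be inspected offline with a prepared request; the URL and value are made up:)

```python
import requests

session = requests.session()
req = requests.Request("GET", "https://www.apage.com/home",
                       cookies={".ASPXAUTH": "token456"})  # made-up value
prepped = session.prepare_request(req)
# requests built the Cookie header from the cookies= dict itself
print(prepped.headers["Cookie"])  # .ASPXAUTH=token456
```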

Thanks

Edit 2:

(imports added here for completeness)

    import os
    import urllib
    import datetime
    from datetime import timedelta
    import requests

    for customer in customers:
        session = requests.session()

        # create a folder for this customer
        path = "C:\\Users\\Desktop\\customers\\" + customer["dir"] + "\\page"
        if not os.path.exists(path):
            os.makedirs(path, 0755)

        search_date = datetime.datetime.now().strftime("%d-%m-%Y-%H-%M-%S")
        search_date_end = (datetime.datetime.now() - timedelta(days=30)).strftime("%d/%m/%Y")
        search_date_begining = (datetime.datetime.now() - timedelta(days=30)).strftime("%d/%m/%Y")
        search_date_closing = (datetime.datetime.now() - timedelta(days=45)).strftime("%d/%m/%Y")

        search_date_closing = urllib.quote_plus(search_date_closing)
        search_date_begining = urllib.quote_plus(search_date_begining)
        search_date_end = urllib.quote_plus(search_date_end)

        print str(search_date_end)

        ##### makes the login

        url = "https://www.ASITE.com/aunthenticate/APAGELogin"  # login
        payload = "user=" + customer["user"] + "&password=" + customer["pass"] + "&action=login"
        headers = {
            'User-Agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0",  # for some reason the login breaks without this
            'Content-Type': "application/x-www-form-urlencoded",
            'X-Requested-With': "XMLHttpRequest",
            'Cookie': "ASP.NET_SessionId=<some cookie>; __RequestVerificationToken_L0NvbnN1bHRhV2Vi0=<part1cookie>-<part2cookie>-<part3cookie>",  # I need these cookies to log in, and I can't get them by any means
            'Cache-Control': "no-cache",
        }

        login = session.post(url, data=payload, headers=headers)  # opens the login session on the page
        print "Login session open"
        cookie = session.cookies.get_dict()  # this only returns the '.ASPXAUTH' cookie, which I have to fetch again after every request or the session expires
        print login.text
        # the response is a single line with some site data confirming the login;
        # if the login fails it returns an HTML page with the error message

        # here I try to get the request cookies (not the response ones),
        # but the headers don't contain any cookies at all
        print '================'
        print login.request.headers
        print '================'
        print '\n\n\n'
        cookie2 = login.headers.get('Set-Cookie')
        print login.headers
        print cookie2
        print login.cookies.get_dict()  # this returns only the '.ASPXAUTH' cookie, the one I already know how to get

        # makes the GET request to the index page
        url = "https://www.ASITE.com/index/home"

        headers = {
            'Accept': "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
            'User-Agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0",  # again I need to pass the User-Agent
            'Accept-Language': "pt-BR,pt;q=0.8,en-US;q=0.5,en;q=0.3",
            'Cookie': "ASP.NET_SessionId=<a cookie>; __RequestVerificationToken_L0NvbnN1bHRhV2Vi0=<other long cookie>; .ASPXAUTH=" + str(cookie['.ASPXAUTH']),  # I have to keep re-sending the '.ASPXAUTH' cookie on every request to the site
            'Upgrade-Insecure-Requests': "1",
            'Cache-Control': "no-cache",
        }

        moving = session.get(url, headers=headers)

        cookie = session.cookies.get_dict()  # get the '.ASPXAUTH' again

The problem is that if I manually set the missing cookies, the code works for a couple of days, but when they expire, or when another machine runs the code, I have to set them again by hand. I've tried several things to get those two other cookies before the requests, and none worked; for some reason the requests library is not handling them automatically as it should. I honestly don't know what to do anymore.
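(About the `__RequestVerificationToken` cookie: ASP.NET's anti-forgery scheme typically pairs the cookie with a hidden form field of the same name in the login page's HTML, so one approach is to GET the login page first and scrape the field value out of the response body. A sketch against made-up HTML; the field name and layout are assumptions about the site:)

```python
import re

# hypothetical login-page HTML; ASP.NET anti-forgery tokens usually appear
# as a hidden input like this
html = '''
<form action="/login" method="post">
  <input name="__RequestVerificationToken" type="hidden" value="tok-abc-123" />
</form>
'''

match = re.search(r'name="__RequestVerificationToken"[^>]*value="([^"]+)"', html)
token = match.group(1) if match else None
print(token)  # tok-abc-123
```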


Solution

  • The code started working. The good news is that the code is now getting the cookies the right way; the bad news is I have absolutely no idea how that happened.

    The only thing I added was this piece of code (the catch is that I added it yesterday and it didn't work then... now it works):

    headers = {
        'User-Agent': "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0",
        'Content-Type': "application/x-www-form-urlencoded",
        'X-Requested-With': "XMLHttpRequest",
        'Cache-Control': "no-cache",
    }

    url = "https://www.asite.com/login"  # login page

    login = session.get(url, headers=headers)  # GET the login page before the POST

    print 'login.request.headers ================'
    print login.request.headers
    print '================'
    print '\n\n\n'
    cookie2 = login.headers.get('Set-Cookie')
    print 'login.headers ============================='
    print login.headers
    headers = login.headers
    print '\n\n\n'
    print "login.headers.get('Set-Cookie') ================================"
    print cookie2
    print '\n\n\n'
    print "login.cookies.get_dict() ========================="
    test = login.cookies.get_dict()
    print test
    print '\n\n\n'

    Yesterday `login.cookies.get_dict()` just returned an empty dict or None, or, if placed after the login, returned only the '.ASPXAUTH' cookie... now it is working.
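(That behavior is consistent with how the fix is supposed to work: the extra GET makes the server answer with `Set-Cookie` headers, the session stores them in its jar, and every later request on the same session sends them back automatically. The round trip can be reproduced offline with a throwaway local server; everything here is made up for the demo:)

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

import requests

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # act like the login page: hand out a session cookie
        self.send_response(200)
        self.send_header("Set-Cookie", "ASP.NET_SessionId=abc123; Path=/")
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

session = requests.session()
session.get("http://127.0.0.1:%d/login" % server.server_port)  # GET first ...
got = session.cookies.get("ASP.NET_SessionId")  # ... the jar now holds the cookie
server.shutdown()
print(got)  # abc123
```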