
I'm using Scrapy to scrape sites that require login, but I'm not sure exactly which fields I need to save and load in order to keep the session.

With Selenium I'm doing the following to save the cookies:

import pickle
import selenium.webdriver 

driver = selenium.webdriver.Firefox()
driver.get("http://www.google.com")
pickle.dump( driver.get_cookies() , open("cookies.pkl","wb"))

And this to load them:

import pickle
import selenium.webdriver

driver = selenium.webdriver.Firefox()
driver.get("http://www.google.com")
cookies = pickle.load(open("cookies.pkl", "rb"))
for cookie in cookies:
    driver.add_cookie(cookie)

And it works just fine. Is it possible to do exactly this using Scrapy?

1 Answer

Send a request with cookies:

from scrapy import Request

request_with_cookies = Request(url="http://www.example.com", cookies={'currency': 'USD', 'country': 'UY'})
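
For context, a minimal sketch of how such a request might sit inside a spider (the spider name, URL, and callback are placeholders):

import scrapy

class ExampleSpider(scrapy.Spider):
    name = "example"  # placeholder spider name

    def start_requests(self):
        # Scrapy's CookiesMiddleware stores these cookies and resends them
        # on later requests within the same crawl
        yield scrapy.Request(
            url="http://www.example.com",
            cookies={'currency': 'USD', 'country': 'UY'},
            callback=self.parse,
        )

    def parse(self, response):
        self.logger.info(response.url)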

Get cookies from response:

# cookies set by the server arrive in the Set-Cookie response headers
cookies_from_response = [c.decode() for c in response.headers.getlist('Set-Cookie')]
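
To mirror the Selenium save/load flow across runs, one option is to parse the Set-Cookie headers yourself, pickle the name/value pairs, and pass them back to Request on the next crawl. A rough sketch, assuming plain name/value cookies are enough (the file name and helper names are illustrative):

import pickle
from http.cookies import SimpleCookie

def save_cookies(response, path="cookies.pkl"):
    # Collect name/value pairs from every Set-Cookie header and pickle them
    cookies = {}
    for header in response.headers.getlist('Set-Cookie'):
        parsed = SimpleCookie()
        parsed.load(header.decode())
        for name, morsel in parsed.items():
            cookies[name] = morsel.value
    with open(path, "wb") as f:
        pickle.dump(cookies, f)

def load_cookies(path="cookies.pkl"):
    # Returns a dict you can pass as Request(..., cookies=...)
    with open(path, "rb") as f:
        return pickle.load(f)

Note that within a single crawl Scrapy's CookiesMiddleware already keeps the session cookies for you, so manual persistence like this is mainly useful between runs.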