

Faking Random Request Headers for a Crawler in Pyspider: a Worked Example

2020-01-04 15:10:39

Pyspider uses the tornado library for its HTTP requests, and a request can carry all sorts of parameters: the connection timeout, the data-transfer timeout, the request headers, and so on. In pyspider's stock framework, however, request parameters can only be supplied through the crawl_config Python dict (shown below); the framework code converts the entries of this dict into task data when building the HTTP request. The drawback of this approach is that there is no convenient way to give each individual request a random header.

crawl_config = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36",
    "timeout": 120,
    "connect_timeout": 60,
    "retries": 5,
    "fetch_type": 'js',
    "auto_recrawl": True,
}

Here is a way to give the crawler a random request header:

1. Write the following script, place it in pyspider's libs folder, and name it header_switch.py.

#!/usr/bin/env python
# -*- coding:utf-8 -*-
# Created on 2017-10-18 11:52:26
import random


class HeadersSelector(object):
    """
    Host and Cookie are deliberately absent from these headers;
    they are filled in per request.
    """
    headers_1 = {
        "Proxy-Connection": "keep-alive",
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
        "User-Agent": "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36",
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
        "DNT": "1",
        "Accept-Encoding": "gzip, deflate, sdch",
        "Accept-Language": "zh-CN,zh;q=0.8,en-US;q=0.6,en;q=0.4",
        "Referer": "https://www.baidu.com/s?wd=%BC%96%E7%A0%81&rsv_spt=1&rsv_iqid=0x9fcbc99a0000b5d7&issp=1&f=8&rsv_bp=1&rsv_idx=2&ie=utf-8&rqlang=cn&tn=baiduhome_pg&rsv_enter=0&oq=If-None-Match&inputT=7282&rsv_t",
        "Accept-Charset": "gb2312,gbk;q=0.7,utf-8;q=0.7,*;q=0.7",
    }  # a browser header found online
    headers_2 = {
        "Proxy-Connection": "keep-alive",
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
        "User-Agent": "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.221 Safari/537.36 SE 2.X MetaSr 1.0",
        "Accept": "image/gif,image/x-xbitmap,image/jpeg,application/x-shockwave-flash,application/vnd.ms-excel,application/vnd.ms-powerpoint,application/msword,*/*",
        "DNT": "1",
        "Referer": "https://www.baidu.com/link?url=c-FMHf06-ZPhoRM4tWduhraKXhnSm_RzjXZ-ZTFnPAvZN",
        "Accept-Encoding": "gzip, deflate, sdch",
        "Accept-Language": "zh-CN,zh;q=0.8,en-US;q=0.6,en;q=0.4",
    }  # a Windows 7 browser
    headers_3 = {
        "Proxy-Connection": "keep-alive",
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
        "User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:52.0) Gecko/20100101 Firefox/52.0",
        "Accept": "image/x-xbitmap,image/jpeg,application/x-shockwave-flash,application/vnd.ms-excel,application/vnd.ms-powerpoint,application/msword,*/*",
        "DNT": "1",
        "Referer": "https://www.baidu.com/s?wd=http%B4%20Pragma&rsf=1&rsp=4&f=1&oq=Pragma&tn=baiduhome_pg&ie=utf-8&usm=3&rsv_idx=2&rsv_pq=e9bd5e5000010",
        "Accept-Encoding": "gzip, deflate, sdch",
        "Accept-Language": "zh-CN,zh;q=0.8,en-US;q=0.7,en;q=0.6",
    }  # Firefox on Linux
    headers_4 = {
        "Proxy-Connection": "keep-alive",
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:55.0) Gecko/20100101 Firefox/55.0",
        "Accept": "*/*",
        "DNT": "1",
        "Referer": "https://www.baidu.com/link?url=c-FMHf06-ZPhoRM4tWduhraKXhnSm_RzjXZ-ZTFnP",
        "Accept-Encoding": "gzip, deflate, sdch",
        "Accept-Language": "zh-CN,zh;q=0.9,en-US;q=0.7,en;q=0.6",
    }  # Firefox on Windows 10
    headers_5 = {
        "Connection": "keep-alive",
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64;) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36 Edge/15.15063",
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
        "Referer": "https://www.baidu.com/link?url=c-FMHf06-ZPhoRM4tWduhraKXhnSm_RzjXZ-",
        "Accept-Encoding": "gzip, deflate, sdch",
        "Accept-Language": "zh-CN,zh;q=0.9,en-US;q=0.7,en;q=0.6",
        "Accept-Charset": "gb2312,gbk;q=0.7,utf-8;q=0.7,*;q=0.7",
    }  # Edge on Windows 10
    headers_6 = {
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8",
        "Accept-Encoding": "gzip, deflate, sdch",
        "Accept-Language": "zh-CN,zh;q=0.8",
        "Pragma": "no-cache",
        "Cache-Control": "no-cache",
        "Connection": "keep-alive",
        "DNT": "1",
        "Referer": "https://www.baidu.com/s?wd=If-None-Match&rsv_spt=1&rsv_iqid=0x9fcbc99a0000b5d7&issp=1&f=8&rsv_bp=1&rsv_idx=2&ie=utf-8&rq",
        "Accept-Charset": "gb2312,gbk;q=0.7,utf-8;q=0.7,*;q=0.7",
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.221 Safari/537.36 SE 2.X MetaSr 1.0",
    }  # a Windows 10 browser

    def select_header(self):
        # Pick one of the six headers at random
        n = random.randint(1, 6)
        switch = {
            1: self.headers_1,
            2: self.headers_2,
            3: self.headers_3,
            4: self.headers_4,
            5: self.headers_5,
            6: self.headers_6,
        }
        return switch[n]

I wrote only six headers here; if the crawl volume is very large you can add far more, even a hundred or so, and widen the random range to choose among them (see the sketch below).
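As the pool grows, a simpler variant is to keep the headers in a list and let random.choice pick one, so nothing needs renumbering when a header is added. A minimal sketch, reusing the attribute names from the script above:

import random

class HeadersSelector(object):
    # headers_1 ... headers_6 defined exactly as in header_switch.py above

    def select_header(self):
        pool = [self.headers_1, self.headers_2, self.headers_3,
                self.headers_4, self.headers_5, self.headers_6]
        return random.choice(pool)  # scales to any pool size without a switch dict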

2. In the pyspider script itself, write the following code:

#!/usr/bin/env python
# -*- encoding: utf-8 -*-
# Created on 2017-08-18 11:52:26
from pyspider.libs.base_handler import *
from pyspider.libs.header_switch import HeadersSelector
import sys

# Force UTF-8 as the process default encoding (Python 2 only)
defaultencoding = 'utf-8'
if sys.getdefaultencoding() != defaultencoding:
    reload(sys)
    sys.setdefaultencoding(defaultencoding)


class Handler(BaseHandler):
    crawl_config = {
        "user_agent": "Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36",
        "timeout": 120,
        "connect_timeout": 60,
        "retries": 5,
        "fetch_type": 'js',
        "auto_recrawl": True,
    }

    @every(minutes=24 * 60)
    def on_start(self):
        header_slt = HeadersSelector()
        header = header_slt.select_header()  # get a fresh header
        # header["X-Requested-With"] = "XMLHttpRequest"
        orig_href = 'http://sww.bjxch.gov.cn/gggs.html'
        self.crawl(orig_href,
                   callback=self.index_page,
                   headers=header)  # the headers must be passed to crawl(); cookies are found in response.cookies

    @config(age=24 * 60 * 60)
    def index_page(self, response):
        header_slt = HeadersSelector()
        header = header_slt.select_header()  # get a fresh header
        # header["X-Requested-With"] = "XMLHttpRequest"
        if response.cookies:
            header["Cookie"] = response.cookies  # a dict; see the note on flattening it below

The crucial point is that in every callback (on_start, index_page, and so on), a header selector is instantiated on each call, so every request goes out with a different header. Note the lines being added:

header_slt = HeadersSelector()
header = header_slt.select_header()  # get a fresh header
# header["X-Requested-With"] = "XMLHttpRequest"
header["Host"] = "www.baidu.com"
if response.cookies:
    header["Cookie"] = response.cookies  # a dict; see the note on flattening it below

When an XHR sends an Ajax request it carries the X-Requested-With header, which servers often use to decide whether a request is Ajax, so {'X-Requested-With': 'XMLHttpRequest'} has to be added to the headers before such content can be fetched. For example:
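Inside a callback, fetching such an Ajax-loaded listing would mean enabling the commented-out line from the script above before calling crawl. A sketch (the endpoint URL and callback name here are only illustrative, not from the original article):

header_slt = HeadersSelector()
header = header_slt.select_header()
header["X-Requested-With"] = "XMLHttpRequest"  # mark the request as Ajax
self.crawl('http://example.com/ajax/list',     # hypothetical Ajax endpoint
           callback=self.detail_page,
           headers=header)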

Fixing the URL also fixes the Host field of the request header, so add it as needed. The urlparse module provides parsing functions that extract the host from a URL; just read the netloc attribute of the result.
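A minimal sketch of that, using Python 2's urlparse module to match the article's code (in Python 3 the same function lives in urllib.parse):

from urlparse import urlparse  # Python 3: from urllib.parse import urlparse

url = 'http://sww.bjxch.gov.cn/gggs.html'
header["Host"] = urlparse(url).netloc  # -> 'sww.bjxch.gov.cn'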

If the response carries cookies, they need to be added to the request header.
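Note that pyspider's response.cookies is a dict, while the Cookie header is a single "name=value; name=value" string, so strictly the dict should be flattened first. A small helper for that might look like this (the helper name is my own, not part of pyspider):

def cookies_to_header(cookies):
    # Flatten a cookie dict into the "name=value; name=value" wire format
    return "; ".join("%s=%s" % (k, v) for k, v in cookies.items())

if response.cookies:
    header["Cookie"] = cookies_to_header(response.cookies)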

Add any other disguises you need in the same way.

That is all it takes to give every request a random header.

This has been the whole of this example of faking random request headers for a crawler in Pyspider; I hope it gives you a useful reference.

