

Using Proxies for Scraping with Python's Scrapy Crawler Framework

2020-01-04 17:44:54
Source: reprint | Contributed by: a reader
This article explains how to route Scrapy requests through a proxy, and also shows how to pick a random user-agent from a predefined list so that different pages are fetched with different agents. Readers who need this can refer to the steps below.
 

1. Create a new file "middlewares.py" in your Scrapy project

# Import base64 only because the proxy in this example requires
# basic authentication
import base64

# Start your middleware class
class ProxyMiddleware(object):
    # Overwrite process_request
    def process_request(self, request, spider):
        # Set the location of the proxy
        request.meta['proxy'] = "http://YOUR_PROXY_IP:PORT"

        # Use the following lines only if your proxy requires authentication
        proxy_user_pass = "USERNAME:PASSWORD"
        # Set up basic authentication for the proxy; b64encode replaces the
        # deprecated base64.encodestring and does not append a trailing newline
        encoded_user_pass = base64.b64encode(proxy_user_pass.encode()).decode()
        request.headers['Proxy-Authorization'] = 'Basic ' + encoded_user_pass
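If you have more than one proxy available, the same idea extends to a pool that the middleware samples per request. The sketch below is a hypothetical, Scrapy-free illustration of the meta/header values such a middleware would produce; the proxy URLs, credentials, and the `proxy_headers` helper are made up for the example.

```python
import base64

# Hypothetical proxy pool; replace with your own endpoints and credentials.
PROXIES = [
    {"url": "http://10.0.0.1:8080", "user_pass": "alice:secret"},
    {"url": "http://10.0.0.2:8080", "user_pass": None},  # no auth needed
]

def proxy_headers(proxy):
    """Return the (meta, headers) pair the middleware would set for one proxy."""
    meta = {"proxy": proxy["url"]}
    headers = {}
    if proxy["user_pass"]:
        encoded = base64.b64encode(proxy["user_pass"].encode()).decode()
        headers["Proxy-Authorization"] = "Basic " + encoded
    return meta, headers

meta, headers = proxy_headers(PROXIES[0])
print(meta["proxy"])                   # http://10.0.0.1:8080
print(headers["Proxy-Authorization"])  # Basic YWxpY2U6c2VjcmV0
```

In a real middleware you would call something like this from `process_request`, e.g. with `random.choice(PROXIES)`, so each request goes out through a different proxy.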

2. Add the following to the project settings file (./project_name/settings.py)

DOWNLOADER_MIDDLEWARES = {
    'scrapy.contrib.downloadermiddleware.httpproxy.HttpProxyMiddleware': 110,
    'project_name.middlewares.ProxyMiddleware': 100,
}
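Note that in Scrapy 1.0 and later the `scrapy.contrib` module paths were renamed; on a modern install the same setting would look roughly like this (assuming your project package is still called `project_name`):

```python
# Modern (Scrapy >= 1.0) module path for the built-in proxy middleware;
# the second entry depends on your own project's package layout.
DOWNLOADER_MIDDLEWARES = {
    "scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware": 110,
    "project_name.middlewares.ProxyMiddleware": 100,
}
```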

That's it: with just these two steps, requests now go through the proxy. Test it out ^_^

from scrapy.contrib.spiders import CrawlSpider

class TestSpider(CrawlSpider):
    name = "test"
    domain_name = "whatismyip.com"
    # The following url is subject to change, you can get the last updated one from here:
    # http://www.whatismyip.com/faq/automation.asp
    start_urls = ["http://xujian.info"]

    def parse(self, response):
        # Save the page so you can check which IP the target site saw
        with open('test.html', 'wb') as f:
            f.write(response.body)

3. Using a random user-agent

By default Scrapy crawls with a single user-agent, which makes it easy for a site to block you. The code below picks a user-agent at random from a predefined list for each page it fetches.

Add the following to settings.py:

DOWNLOADER_MIDDLEWARES = {
    'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware': None,
    'Crawler.comm.rotate_useragent.RotateUserAgentMiddleware': 400,
}

Note: Crawler is your project's name, i.e. the name of the project's top-level directory. The middleware code is below.

#!/usr/bin/python
# -*- coding: utf-8 -*-
import random

from scrapy.contrib.downloadermiddleware.useragent import UserAgentMiddleware

class RotateUserAgentMiddleware(UserAgentMiddleware):
    def __init__(self, user_agent=''):
        self.user_agent = user_agent

    def process_request(self, request, spider):
        # Pick a user-agent at random for this request
        ua = random.choice(self.user_agent_list)
        if ua:
            request.headers.setdefault('User-Agent', ua)

    # The default user_agent_list covers Chrome, IE, Firefox, Mozilla, Opera and Netscape.
    # For more user-agent strings, see http://www.useragentstring.com/pages/useragentstring.php
    user_agent_list = [
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
        "Mozilla/5.0 (X11; CrOS i686 2268.111.0) AppleWebKit/536.11 (KHTML, like Gecko) Chrome/20.0.1132.57 Safari/536.11",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6",
        "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6",
        "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1",
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.9 Safari/536.5",
        "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
        "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_0) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
        "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
        "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
        "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
        "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
        "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3",
        "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
        "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24",
    ]
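The `setdefault` call in `process_request` means the rotated user-agent never overwrites a header the spider set explicitly. A minimal stand-alone sketch of that behaviour, using a plain dict in place of Scrapy's Headers object and placeholder strings in place of the real user-agent list:

```python
import random

# Placeholder strings standing in for the full user_agent_list above
user_agent_list = ["UA-chrome", "UA-firefox", "UA-opera"]

# Case 1: no User-Agent set yet -> the random choice is applied
headers = {}
headers.setdefault("User-Agent", random.choice(user_agent_list))
assert headers["User-Agent"] in user_agent_list

# Case 2: the spider already set a User-Agent -> setdefault leaves it alone
headers = {"User-Agent": "my-custom-agent"}
headers.setdefault("User-Agent", random.choice(user_agent_list))
print(headers["User-Agent"])  # my-custom-agent
```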
