MySQLdb

Source: 4-19 Saving data to the database in the pipeline

fengweihan

2020-06-26

Hello, teacher:
My mysqlclient package imports correctly and nothing is flagged before running, but at runtime it keeps reporting that MySQLdb cannot be found. What could be the reason? Thank you very much!!

[Screenshot: no error is reported here]

[Screenshot: the mysqlclient package is installed]

Below is the code in my pipeline.py:

# -*- coding: utf-8 -*-

import codecs
import json

import MySQLdb
from scrapy.exporters import JsonItemExporter
from scrapy.pipelines.images import ImagesPipeline


class ArticleSpiderPipeline:
    def process_item(self, item, spider):
        return item


class ArticleImagesPipeline(ImagesPipeline):
    def item_completed(self, results, item, info):
        # Record the local path of the downloaded cover image on the item.
        if "front_image_url" in item:
            for ok, value in results:
                image_file_path = value["path"]
                item["front_image_path"] = image_file_path
        return item


class JsonWithEncodingPipeline(object):
    def __init__(self):
        self.file = codecs.open("article.json", "w", encoding="utf-8")

    def process_item(self, item, spider):
        lines = json.dumps(dict(item), ensure_ascii=False) + "\n"
        self.file.write(lines)
        return item

    def close_spider(self, spider):
        # Scrapy calls close_spider() automatically when the spider finishes.
        self.file.close()


class JsonExporterPipeline(object):
    def __init__(self):
        self.file = open("articleexport.json", "wb")
        self.exporter = JsonItemExporter(self.file, encoding="utf-8", ensure_ascii=False)
        self.exporter.start_exporting()

    def process_item(self, item, spider):
        self.exporter.export_item(item)
        return item

    def close_spider(self, spider):
        self.exporter.finish_exporting()
        self.file.close()


class MysqlPipeline(object):
    def __init__(self):
        self.conn = MySQLdb.connect("127.0.0.1", "root", "knight213105", "article_spider",
                                    charset="utf8", use_unicode=True)
        self.cursor = self.conn.cursor()

    def process_item(self, item, spider):
        insert_sql = """
            insert into jobbole_article(title, url, url_id, front_image_url, front_image_path,
                favNums, viewNum, commentNum, tags, content, create_date)
            values (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
        """
        params = list()
        params.append(item.get("title", ""))
        params.append(item.get("url", ""))
        params.append(item.get("url_id", ""))
        params.append(item.get("front_image_url", ""))
        params.append(item.get("front_image_path", ""))
        params.append(item.get("favNums", 0))
        params.append(item.get("viewNum", 0))
        params.append(item.get("commentNum", 0))
        params.append(item.get("tags", ""))
        params.append(item.get("content", ""))
        params.append(item.get("create_date", "1970-01-01"))  # was item.get(['create_date'], ...), which raises TypeError
        self.cursor.execute(insert_sql, tuple(params))
        self.conn.commit()  # commit() belongs to the connection, not the cursor
        return item
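
For reference, none of these pipelines run unless they are registered in settings.py. A minimal sketch follows; the module path "ArticleSpider.pipelines" is an assumption (use your own project name), the class names come from the file above, and the numbers only set the processing order:

# settings.py -- minimal sketch; "ArticleSpider.pipelines" is an assumed module path
ITEM_PIPELINES = {
    "ArticleSpider.pipelines.ArticleImagesPipeline": 1,  # download covers, record the local path
    "ArticleSpider.pipelines.MysqlPipeline": 2,          # then insert the item into MySQL
}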

1 Answer

bobby

2020-06-27

The MySQLdb part looks fine. Please post a screenshot of the actual error message so I can take a look; the more complete, the better.
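
A frequent cause of a runtime "No module named 'MySQLdb'" when the import looks fine in the IDE is that scrapy crawl is launched with a different interpreter or virtualenv than the one mysqlclient was installed into. A minimal diagnostic sketch (not part of the course code), to be run from the same environment that starts the spider:

# check_env.py -- hypothetical helper; run it with the same interpreter that runs scrapy
import sys
print(sys.executable)    # which Python is actually being used
import MySQLdb           # fails here if mysqlclient lives in a different environment
print(MySQLdb.__file__)  # where the MySQLdb module was loaded from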

