【Question Title】: Import scraped data from website directly into PostgreSQL
【Posted】: 2017-09-02 09:26:17
【Question】:

I want to import web-scraped data directly into PostgreSQL instead of exporting it to a .csv file.

This is the code I'm currently using: it exports the data to a .csv file, which I then import manually. Any help would be appreciated.

from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
my_url = 'http://tis.nhai.gov.in/TollInformation?TollPlazaID=236'
uClient = uReq(my_url)
page1_html = uClient.read()
uClient.close()
#html parsing
page1_soup = soup(page1_html,"html.parser")

filename = "TollDetail12.csv"
f = open(filename,"w")
headers = "ID, tollname, location, highwayNumber\n"
f.write(headers)

#grabbing data
containers = page1_soup.findAll("div",{"class":"PA15"})
for container in containers:
    toll_name = container.p.b.text

    search1 = container.findAll('b')
    highway_number = search1[1].text

    location = list(container.p.descendants)[10]
    ID = my_url[my_url.find("?"):]
    mystr = ID.strip("?")
    print("ID: " + mystr)
    print("toll_name: " + toll_name)
    print("location: " + location)
    print("highway_number: " + highway_number)
        

    f.write(mystr + "," + toll_name + "," + location + "," + highway_number.replace(",","|") + "\n")
f.close()
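As an aside, the manual `replace(",","|")` in the write above works around commas inside the highway field; Python's standard `csv` module handles this by quoting fields automatically. A minimal sketch, using made-up row values for illustration:

```python
import csv
import io

# One sample row in the same field order as the header above:
# ID, tollname, location, highwayNumber (the values are placeholders).
rows = [("TollPlazaID=236", "Sample Plaza", "Sample Location", "NH-44, km 120")]

buf = io.StringIO()  # stands in for the open file handle
writer = csv.writer(buf)  # quotes any field that contains a comma
writer.writerow(["ID", "tollname", "location", "highwayNumber"])
writer.writerows(rows)
print(buf.getvalue())
```

The field containing a comma comes out as `"NH-44, km 120"`, so the column count stays correct without rewriting the data.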

【Comments】:

Tags: python postgresql csv psycopg2


【Solution 1】:

You need to install the psycopg2 pip package. Beyond that, edit the file with your project-specific details; this hasn't been tested, but it should work.

from urllib.request import urlopen as uReq
from bs4 import BeautifulSoup as soup
import psycopg2

my_url = 'http://tis.nhai.gov.in/TollInformation?TollPlazaID=236'
uClient = uReq(my_url)
page1_html = uClient.read()
uClient.close()
# html parsing
page1_soup = soup(page1_html, 'html.parser')

# grabbing data
containers = page1_soup.findAll('div', {'class': 'PA15'})

# Make the connection to PostgreSQL
conn = psycopg2.connect(database='database_name',
                        user='user_name', password='user_password', port=5432)
cursor = conn.cursor()
for container in containers:
    toll_name = container.p.b.text

    search1 = container.findAll('b')
    highway_number = search1[1].text

    location = list(container.p.descendants)[10]
    ID = my_url[my_url.find('?'):]
    mystr = ID.strip('?')

    # Parameterized query: psycopg2 does the quoting and escaping
    query = "INSERT INTO table_name (ID, toll_name, location, highway_number) VALUES (%s, %s, %s, %s);"
    data = (mystr, toll_name, location, highway_number)

    cursor.execute(query, data)

# Commit the transaction and release the connection
conn.commit()
cursor.close()
conn.close()
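Note that the `ID` extracted by string slicing is the whole query string (`TollPlazaID=236`). If you only want the numeric plaza ID in the table, the standard library can parse it out directly; a minimal sketch:

```python
from urllib.parse import urlparse, parse_qs

my_url = 'http://tis.nhai.gov.in/TollInformation?TollPlazaID=236'

# Parse the query string instead of slicing on '?'.
params = parse_qs(urlparse(my_url).query)
plaza_id = params['TollPlazaID'][0]
print(plaza_id)  # → 236
```

This also keeps working if the URL ever gains additional query parameters.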

【Discussion】:

  • I got this error when running the code: File "C:\Users\prash\AppData\Local\Programs\Python\Python36-32\lib\site-packages\psycopg2\__init__.py", line 130, in connect conn = _connect(dsn, connection_factory=connection_factory, **kwasync) psycopg2.OperationalError: FATAL: role "prashant" is not permitted to log in
  • You need to give that role login permission. This can be done with the following command: ALTER ROLE "prashant" WITH LOGIN;
  • Hi, could you take a look at this? stackoverflow.com/questions/46025873/…
  • Hi, could you help me find a solution for this one? stackoverflow.com/questions/46052939/…