【Question Title】: Scraping data in an element using BeautifulSoup
【Posted】: 2020-03-01 14:56:16
【Problem Description】:

I am scraping a web page with BeautifulSoup. I managed to scrape the names, but I am not sure how to scrape data that is stored inside an element's attributes, such as the phone number and email shown in the image below:

My code:

    import requests
    from bs4 import BeautifulSoup

    raw = requests.get('https://www.iproperty.com.my/property/findanagent.aspx?ty=as&ak=&rk=&pg=1&rmp=10&st=KL&ct=&st1=&ct1=#40091').text
    raw = raw.replace("</br>", "")

    soup = BeautifulSoup(raw, 'html.parser')

    phone = [d['data-content'][1:][:-1] for d in soup.find_all('a', {'class': 'csagentphonelead'})]
    name = [x.text.strip().split("\r\n")[-1].strip() for x in soup.find_all("p", class_='box-listing_agentCS')]
    website = [x.text.strip().split("\r\n")[-1].strip() for x in soup.find_all("a", class_='csagentemaillead')]

    num_page_items = len(name)
    with open('results180.csv', 'a') as f:
        for i in range(num_page_items):
            f.write(name[i] + "," + phone[i] + "," + website[i] + "\n")

My scraping results are "Click to view email" and "Click to view phone". How should I fix this so the results are the actual email and phone number?

【Question Discussion】:

Tags: python beautifulsoup scrape


【Solution 1】:

You have to get the value of the data attribute from the link. You can try this code -

    import requests
    from bs4 import BeautifulSoup

    raw = requests.get('https://www.iproperty.com.my/property/findanagent.aspx?ty=as&ak=&rk=&pg=1&rmp=10&st=KL&ct=&st1=&ct1=#40091').text
    raw = raw.replace("</br>", "")

    soup = BeautifulSoup(raw, 'html.parser')

    # The visible link text is just "Click to view phone/email";
    # the real value is stored in the link's data attribute.
    name = [x.text.strip().split("\r\n")[-1].strip() for x in soup.find_all("p", class_='box-listing_agentCS')]
    phone = [x['data-content'].strip() for x in soup.find_all("a", class_='csagentphonelead')]
    website = [x['data-content'].strip() for x in soup.find_all("a", class_='csagentemaillead')]

    num_page_items = len(name)
    with open('results180.csv', 'a') as f:
        for i in range(num_page_items):
            f.write(name[i] + "," + phone[i] + "," + website[i] + "\n")
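The idea above can be checked without hitting the live site. The sketch below runs BeautifulSoup against inline HTML that mimics the page's markup; the class names and the `data-content` attribute are assumptions taken from the question's selectors, not verified against the current iproperty.com.my page. It contrasts `tag.text` (the visible label) with attribute access (the hidden value):

```python
from bs4 import BeautifulSoup

# Inline HTML imitating the agent-listing markup (assumed structure).
html = """
<a class="csagentphonelead" data-content="+60 12-345 6789">Click to view phone</a>
<a class="csagentemaillead" data-content="agent@example.com">Click to view email</a>
"""

soup = BeautifulSoup(html, "html.parser")

# tag.text returns the label the user sees...
labels = [a.text for a in soup.find_all("a", class_="csagentphonelead")]

# ...while subscripting the tag returns the attribute value.
phone = [a["data-content"] for a in soup.find_all("a", class_="csagentphonelead")]
email = [a["data-content"] for a in soup.find_all("a", class_="csagentemaillead")]

print(labels)  # ['Click to view phone']
print(phone)   # ['+60 12-345 6789']
print(email)   # ['agent@example.com']
```

If an attribute might be absent on some tags, `a.get("data-content")` returns `None` instead of raising `KeyError`, which is safer when scraping pages whose markup varies between listings.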
    

【Discussion】:
