Insert Scraped Table Data Directly Into PostgreSQL DB
I want to insert my scraped data directly into a PostgreSQL database, but I'm struggling to write the query for it; any help would be appreciated. The code I've come up with so far: import
Solution 1:
I assume that cols contains 5 elements, in the order presented in your table; otherwise, adjust the indexes. Note that your original INSERT listed only 4 column names but had 5 placeholders, so one_time was silently dropped; the query below includes it.
import urllib.request

import psycopg2  # was missing from the original imports
from bs4 import BeautifulSoup

conn = psycopg2.connect(database='--', user='--', password='--', port='--')
cursor = conn.cursor()

soup = BeautifulSoup(urllib.request.urlopen(
    "http://tis.nhai.gov.in/TollInformation?TollPlazaID=236").read(), 'lxml')
rows = soup('table', {"class": "tollinfotbl"})[0].find_all('tr')

for row in rows:
    cols = row.findChildren(recursive=False)
    cols = [ele.text.strip() for ele in cols]
    # Skip empty rows; note that a header row (th cells) would still
    # reach the int() calls below and raise ValueError, so you may
    # want to skip the first row or guard against non-numeric cells.
    if cols:
        vehicle_type = cols[0]
        one_time = int(cols[1])
        return_strip = int(cols[2])
        monthly_pass = int(cols[3])
        local_vehicle = int(cols[4])

        query = ("INSERT INTO table_name "
                 "(vehicle_type, one_time, return_strip, monthly_pass, local_vehicle) "
                 "VALUES (%s, %s, %s, %s, %s);")
        data = (vehicle_type, one_time, return_strip, monthly_pass, local_vehicle)
        cursor.execute(query, data)

# Commit the transaction once, after all rows are queued
conn.commit()
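One caveat: toll-table cells often hold text like "Rs. 75", "1,200", or "-" rather than bare integers, in which case int(cols[1]) raises ValueError. A small hedged helper (the name parse_amount is my own, and the cell formats are assumptions about the page, not confirmed from it) to normalize cell text before insertion:

import re

def parse_amount(text):
    """Return the integer amount found in a table cell, or None if the
    cell holds no number (e.g. '-' or an empty string).

    Assumes amounts formatted like '75', 'Rs. 75', or '1,200'."""
    match = re.search(r'\d[\d,]*', text)
    if match is None:
        return None
    return int(match.group().replace(',', ''))

With this in place you could write one_time = parse_amount(cols[1]) and skip (or insert NULL for) rows where it returns None, which also makes the loop safe against the header row.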