
@nblackburn87
Created April 17, 2015 23:07
Python Scraper
from bs4 import BeautifulSoup
from urllib3 import PoolManager
from time import sleep

BASE_URL = 'http://www.chicagoreader.com'

def make_soup(url):
    html = urlopen(section_url).read()
    return BeautifulSoup(html, 'lxml')

def get_category_links(section_url):
    soup = make_soup(url)
    boccat = coup.find('dl', 'boccat')
    category_links = [BASE_URL + dd.a['href'] for dd in boccat.findALL('dd')]
    return category_links

def get_category_winner(category_url):
    soup = make_soup(url)
    category = soup.find('h1', 'headline').string
    winner = [h2.string for h2 in soup.findALL('h2', 'boc1')]
    runners_up = [h2.string for h2 in soup.findALL('h2', 'boc2')]
    return {
        'category': category,
        'category_url': category_url,
        'winner': winner,
        'runners_up': runners_up,
    }

if __name__ == '__main__':
    food_n_drink = ('http://chicagoreader.com/chicago'
                    'best-of-chicago-2011-food-drink/BestOf?oid=4106228')
    categories = get_category_links(food_n_drink)
    data = []
    for category in categories:
        winner = get_category_winner(category)
        data.append(winner)
        sleep(1)
    print data

akf commented Apr 17, 2015

So, these need to match up:

def make_soup(url):
    html = urlopen(section_url).read()
    return BeautifulSoup(html, 'lxml')

There's a similar issue in the other two routines. I see what you're getting at about needing to grab two different URLs. I would be inclined to change all of these to "url", but you could also change the parameter (e.g., change "def make_soup(url)" to "def make_soup(section_url)").
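
For concreteness, the matched-up version of that routine might look roughly like this. It is only a sketch: "urlopen" is assumed here to come from Python 2's "urllib2", since the gist calls urlopen without importing it and never uses the "urllib3.PoolManager" it does import.

from urllib2 import urlopen
from bs4 import BeautifulSoup

def make_soup(url):
    # The parameter name and the name used in the body now agree.
    html = urlopen(url).read()
    return BeautifulSoup(html, 'lxml')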

nblackburn87 (Author) commented:

So I could then make different soup like:

def make_other_soup(category_url):

And use them appropriately? Obviously not DRY anymore, but programmatically correct at least?


akf commented Apr 17, 2015

It's important to keep in mind that using "url" in multiple functions to mean different URLs is fine ... as long as it isn't a global variable. (Globals should be used sparingly.)
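
A tiny sketch of that scoping point, using hypothetical functions purely for illustration: each one has its own local "url" parameter, and the two never interfere with each other.

def describe_section(url):
    # This "url" exists only inside describe_section.
    return 'section page: ' + url

def describe_category(url):
    # A completely separate local "url"; no clash with the one above.
    return 'category page: ' + url

print(describe_section('http://example.com/section'))
print(describe_category('http://example.com/category'))

# A module-level assignment like url = '...' would instead be a global
# shared by every function, which is the case to use sparingly.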

nblackburn87 (Author) commented:

Sure. I'm not solid enough with my Python to know how to define it otherwise.


akf commented Apr 17, 2015

I'm not sure that's making sense to me. Of course I'm making some assumptions about what you're trying to do, but I don't think you need a make_other_soup unless you want it to do a different thing than make_soup. If I have:

def make_soup(yooarrell):
    [stuff]

and I call:
foo = make_soup(section_url);
bar = make_soup(category_url);

I expect to get different things back for foo and bar.
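
Applied to the gist itself, that would come out roughly as follows. This is only a sketch of the idea, assuming make_soup is fixed to use its own parameter as shown earlier and BASE_URL is the constant from the gist; it also quietly cleans up two unrelated typos in the original ("coup" for "soup", and "findALL" for BeautifulSoup's "findAll").

def get_category_links(section_url):
    # The one shared make_soup, handed the section URL.
    soup = make_soup(section_url)
    boccat = soup.find('dl', 'boccat')
    return [BASE_URL + dd.a['href'] for dd in boccat.findAll('dd')]

def get_category_winner(category_url):
    # The same make_soup again, this time handed a category URL.
    soup = make_soup(category_url)
    category = soup.find('h1', 'headline').string
    winner = [h2.string for h2 in soup.findAll('h2', 'boc1')]
    runners_up = [h2.string for h2 in soup.findAll('h2', 'boc2')]
    return {
        'category': category,
        'category_url': category_url,
        'winner': winner,
        'runners_up': runners_up,
    }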

nblackburn87 (Author) commented:

Ahh. That makes sense I think. Lemme see what I can get going. Thanks!
