Python requests module crashes/returns nothing


We are attempting to create a small server-side Python script that scrapes HTML from another website. The problem is that the script returns nothing after the `requests.get(...)` statement. To test this, we are using the test_rest.html file, and the output is always blank. We installed bunch, requests, and bs4 (Beautiful Soup), and we even ran the example 'math' script to verify that everything was set up correctly; that did return a correct answer.

from bs4 import BeautifulSoup
import requests

verb = event.request.method

if verb != 'GET':
    raise Exception('Only HTTP GET is allowed on this endpoint.')

resource = event.resource

params = event.request.parameters

required = ['temp']

if resource != "":
    for element in required:
        if params.get(element, "") == "":
            raise Exception('Missing ' + element + ' in params.')

isbn = str(params.temp)

if resource != 'isbn':
    result = {'resource': ['isbn']}

result = []
r = requests.get('' + isbn)

soup = BeautifulSoup(r.text, 'html.parser')
table = soup.find("table", {"class": "table table-striped table-hover table-condensed"})
rows = table.find_all("table", {"class": "table table-condensed"})
for row in rows:
    seller = row.find('span')['title'].split()[0]
    condition = row.find('td', {"class": "condition"}).contents[0].split()[0]
    price = row.find('td', {"class": "total"}).contents[0].split()[0]
    result.append({'seller': seller, 'condition': condition, 'price': price})

return result
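For what it's worth, the parsing half of the script can be exercised on its own against an inline HTML snippet, which helps separate a parsing bug from a blank `requests.get` response. The markup below is hypothetical (it just mirrors the class names used in the script), and it assumes the rows being scraped are plain `<tr>` elements rather than nested tables:

```python
from bs4 import BeautifulSoup

# Hypothetical markup mirroring the classes in the script above, so the
# extraction logic can be verified without any network call.
SAMPLE_HTML = """
<table class="table table-striped table-hover table-condensed">
  <tr>
    <td><span title="Acme Books">Acme Books</span></td>
    <td class="condition">Good condition</td>
    <td class="total">$12.50 total</td>
  </tr>
</table>
"""

def extract_rows(html):
    # Always name the parser explicitly; bs4 warns (and may pick a
    # different parser) when it is omitted.
    soup = BeautifulSoup(html, 'html.parser')
    table = soup.find('table', {'class': 'table table-striped table-hover table-condensed'})
    result = []
    # Iterate over the table's rows, accumulating one dict per row.
    for row in table.find_all('tr'):
        result.append({
            'seller': row.find('span')['title'].split()[0],
            'condition': row.find('td', {'class': 'condition'}).contents[0].split()[0],
            'price': row.find('td', {'class': 'total'}).contents[0].split()[0],
        })
    return result
```

If this returns the expected dicts but the real script still yields nothing, the problem is upstream of the parsing (the request itself, or `result` never being populated).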


Debugging these scripts is a big pain. Normally what I do is pass the data as a string to raise, so that you can see what is happening in the resulting HTTP error. Otherwise you need to put DF in debug mode, call print(), and then check the log file.

Unfortunately, if the script has an error it basically just gets skipped, and you have no idea what happened.
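The raise-based trick above can be sketched as a small helper (debug_dump is a hypothetical name, not part of any framework):

```python
def debug_dump(data):
    """Surface intermediate state by raising it as a string, so the value
    shows up in the HTTP error response instead of a log file."""
    raise Exception('DEBUG: ' + repr(data))

# For example, to inspect what requests returned before parsing:
# debug_dump({'status': r.status_code, 'body_start': r.text[:200]})
```

Crude, but it works when you have no other visibility into the script's execution.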

You could look here for some hints perhaps: