I am trying to scrape data from a URL using BeautifulSoup. Below is my code:
```python
import requests

URL = "https://bigdataldn.com/speakers/"
page = requests.get(URL)
print(page.text)
```
However, I am getting the following error when I run the code in Google Colab:
```
SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1091)

During handling of the above exception, another exception occurred:

MaxRetryError                             Traceback (most recent call last)
MaxRetryError: HTTPSConnectionPool(host='bigdataldn.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1091)')))
```
The above code works fine for other URLs. Can someone help me figure out how to solve this issue?
It’s not your fault – the site’s certificate chain is not properly configured. What you can do is disable certificate verification (you should not do this when you’re handling sensitive information!), but it may be acceptable for a web scraper:
```python
page = requests.get(URL, verify=False)
```
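Note that with `verify=False`, requests will print an `InsecureRequestWarning` on every call. If you make many requests, a tidier approach is to suppress that warning and configure a `Session` once. The helper name below (`make_insecure_session`) is just an illustration, not part of the requests API:

```python
import requests
import urllib3

# Silence the InsecureRequestWarning that urllib3 emits when
# certificate verification is disabled.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

def make_insecure_session() -> requests.Session:
    """Return a Session whose requests skip certificate verification.

    Only use this for non-sensitive scraping of sites whose
    certificate chain you cannot fix yourself.
    """
    session = requests.Session()
    session.verify = False  # applies to every request made via this session
    return session
```

Then `session = make_insecure_session()` followed by `page = session.get(URL)` behaves like your original call but without the per-request warning noise. If you can, the cleaner long-term fix is to keep verification on and pass the missing intermediate certificate via the `verify="path/to/ca-bundle.pem"` parameter instead.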