GPU Stock Scraper is a script that scrapes various Canadian computer part supplier websites and determines whether a given item is in stock.
At the time of writing, RTX 3080 stock is scanned across:
- Newegg.ca
  - Checks online stock
- Bestbuy.ca
  - Differentiates online vs. backorder vs. nearby-store stock
- Memoryexpress.com
  - Differentiates online vs. in-store stock
  - Allows selection of specific stores
- Canadacomputers.com
  - Differentiates online vs. in-store stock
  - Allows selection of specific stores
- Amazon.ca
  - Checks online stock
- PC-canada.com
  - Checks online stock
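As a rough illustration of what each per-site check involves, here is a minimal sketch of an online-stock probe. The sold-out marker strings are placeholders, not the scraper's actual selectors; the real checks need per-site logic (often via Selenium, since several of these pages render stock status with JavaScript):

```python
import urllib.request

# Placeholder marker text -- each site uses its own wording and markup.
SOLD_OUT_MARKERS = ["sold out", "out of stock"]

def html_reports_stock(html: str) -> bool:
    """Treat a page as in stock if no sold-out marker appears in it."""
    lowered = html.lower()
    return not any(marker in lowered for marker in SOLD_OUT_MARKERS)

def check_url(url: str) -> bool:
    """Fetch a product page and apply the marker check above."""
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return html_reports_stock(resp.read().decode("utf-8", errors="replace"))
```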
- Download the appropriate chromedriver to the `scraping/` folder
- Install the Python dependencies: `pip3 install -r requirements.txt`
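Before running, it can help to sanity-check that the driver binary is where the script expects it. A small stdlib-only check, assuming the `scraping/chromedriver` path from the step above:

```python
import os
import stat

def chromedriver_ready(path: str = "scraping/chromedriver") -> bool:
    """True if the chromedriver binary exists and is executable."""
    if not os.path.isfile(path):
        return False
    # Check the owner-execute bit; `chmod +x scraping/chromedriver` fixes it.
    return bool(os.stat(path).st_mode & stat.S_IXUSR)
```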
- NOTE: This method is fairly insecure, and Google will require you to allow less secure apps before it works
  - Go to the Less secure app access section of your Google Account. You might need to sign in.
  - Turn Allow less secure apps on.
- Create a `.env` file, using `.env_sample` as a guide, and input your email information
- Acquire your webhook URL from your Discord server for your selected channel
- See here for an explanation: https://support.discord.com/hc/en-us/articles/228383668-Intro-to-Webhooks
- Assign your webhook URL to `DISCORD_WEBHOOK` in your `.env` file
- Set `discord_message_enabled` to `True`
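For reference, posting to a Discord webhook only requires a JSON body with a `content` field. A stdlib-only sketch that reads the `DISCORD_WEBHOOK` variable from the environment (the function names here are illustrative, not the ones in `scraping_functions.py`):

```python
import json
import os
import urllib.request

def build_webhook_payload(message: str) -> bytes:
    """Discord webhooks accept a JSON body with a 'content' field."""
    return json.dumps({"content": message}).encode("utf-8")

def send_discord_alert(message: str) -> None:
    """POST a stock alert to the webhook URL configured in .env."""
    url = os.environ["DISCORD_WEBHOOK"]
    req = urllib.request.Request(
        url,
        data=build_webhook_payload(message),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=10)
```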
```shell
python3 main.py
```
The Docker image contains chromedriver and Python 3, so if you already have Docker, you're good to go.
- Build the image:

  ```shell
  docker build -t gpu-stock-scraper .
  ```

- Run the image:

  ```shell
  docker run --rm -t gpu-stock-scraper
  ```

- If you'd like email notifications, set these environment variables:

  ```shell
  docker run -e EMAIL=${EMAIL} -e PASSWORD=${PASSWORD} -e RECIPIENT1=${RECIPIENT1} -e RECIPIENT2=${RECIPIENT2} --rm -t gpu-stock-scraper
  ```

- If you'd like to change the base polling interval, add the INTERVAL environment variable:

  ```shell
  docker run -e INTERVAL=60 --rm -t gpu-stock-scraper
  ```
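INTERVAL sets the base polling frequency in seconds. A sketch of how a scrape loop typically reads it; the random jitter is an assumption of this sketch (to avoid perfectly regular requests), not necessarily what `main.py` does:

```python
import os
import random
import time
from typing import Optional

def next_delay(base: Optional[float] = None) -> float:
    """Base interval in seconds, read from the INTERVAL env var by
    default (falling back to 60), plus up to 10% random jitter."""
    if base is None:
        base = float(os.environ.get("INTERVAL", "60"))
    return base + random.uniform(0, base * 0.1)

def run_forever(check_once) -> None:
    """Run one scrape pass, then sleep until the next round, forever."""
    while True:
        check_once()
        time.sleep(next_delay())
```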
- Modify `stores_to_check` in `scrape_canada_computers()` and `scrape_memory_express()` to reflect your local stores
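Conceptually, `stores_to_check` acts as a whitelist filter over per-store results. The store names and dictionary shape below are illustrative only; check `scraping_functions.py` for the strings each site actually reports:

```python
# Illustrative store names -- replace with the ones your sites report.
stores_to_check = ["Toronto", "Mississauga", "Ottawa"]

def filter_store_stock(store_stock: dict) -> dict:
    """Keep only in-stock entries for stores the user cares about."""
    return {
        store: status
        for store, status in store_stock.items()
        if store in stores_to_check and status
    }
```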
- If you receive an error installing dotenv, try `pip3 install python-dotenv`
- Build a config file
- Decouple the email, Discord, and beep functions from `scraping_functions.py` into their own modules
- Add model-specific filtering (in the meantime, filter on the website and update the URL accordingly)
- Refine `search_best_buy()` to only return matches for selected stores
- Use a more secure method (potentially OAuth) for sending emails