Job Postings App

A Django app that automatically scrapes a job posting website for the latest jobs and provides you with a fresh list of jobs every day. The website scraped for the latest postings is Monster.

Highlights

  • User authentication
  • Automatic web scraping of the latest job postings using Beautiful Soup and the Heroku Scheduler (see the sketch after this list)
  • Change-password functionality
  • Real-time search
  • Back-to-top button
  • Pagination (5 job postings per page)
  • User-friendly interface
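
Under the hood, the scraping can be wired up as a scheduled Django management command: Heroku Scheduler invokes it once a day, it fetches the Monster listings page with Beautiful Soup, and it stores any postings it has not seen before. The sketch below shows the general shape of such a command; the command name, the jobs.Job model, the URL and the CSS selectors are illustrative assumptions, not the repository's actual code.

    # jobs/management/commands/scrape_jobs.py -- hypothetical module path and names
    import requests
    from bs4 import BeautifulSoup
    from django.core.management.base import BaseCommand

    from jobs.models import Job  # assumed app and model; a model sketch appears later in this README

    class Command(BaseCommand):
        help = "Scrape the latest job postings from Monster (run daily by Heroku Scheduler)"

        def handle(self, *args, **options):
            # The URL and CSS selectors are placeholders; the real ones depend on Monster's markup.
            response = requests.get("https://www.monsterindia.com/srp/results?query=software")
            soup = BeautifulSoup(response.text, "html.parser")

            for card in soup.select("div.card-apply-content"):
                title = card.select_one("h3").get_text(strip=True)
                company = card.select_one("span.company-name").get_text(strip=True)
                link = card.select_one("a")["href"]

                # get_or_create keeps reruns idempotent, so the daily run only adds new postings.
                Job.objects.get_or_create(title=title, company=company, url=link)

            self.stdout.write(self.style.SUCCESS("Job postings updated"))

With a command like this in place, a Heroku Scheduler task running python manage.py scrape_jobs once a day keeps the list current.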

Requirements

  • Python 3.7
  • Django 2.2.8

Additional requirements are listed in the Pipfile.

Setting up the Project

  • Download and install Python 3.7.
  • Download and install Git.
  • Fork the repository.
  • Clone the repository to your local machine: $ git clone https://github.com/<your-github-username>/sch_webscr.git
  • Change directory to sch_webscr: $ cd sch_webscr
  • Install pipenv: $ pip3 install pipenv
  • Create a virtual environment and install all requirements from the Pipfile: $ pipenv install
  • Activate the virtual environment: $ pipenv shell
  • Make migrations: $ python manage.py makemigrations
  • Apply the migrations to the database (a sketch of the underlying model follows this list): $ python manage.py migrate
  • Create a superuser: $ python manage.py createsuperuser
  • Run the development server: $ python manage.py runserver
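
The makemigrations/migrate steps above create the database table that the scraper fills and the job list view reads. For orientation, a job posting model for an app like this could look roughly like the sketch below; the app name, model name and field names are assumptions, not the repository's actual schema.

    # jobs/models.py -- hypothetical sketch; field names are assumptions
    from django.db import models

    class Job(models.Model):
        title = models.CharField(max_length=200)
        company = models.CharField(max_length=200)
        url = models.URLField()
        scraped_on = models.DateField(auto_now_add=True)

        class Meta:
            ordering = ["-scraped_on"]  # newest postings first; the list view paginates 5 at a time

        def __str__(self):
            return f"{self.title} at {self.company}"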

Deployment

This app has been deployed to production on Heroku. Here is the list of steps to follow when deploying an app to Heroku:

  • Run pipenv lock to generate an up-to-date Pipfile.lock: $ pipenv lock
  • Create a Procfile, which tells Heroku how to run the remote server where the code will live: $ touch Procfile
  • The Procfile tells Heroku to use gunicorn as the production server and to look in the project's wsgi module for further instructions. Update the Procfile with: web: gunicorn <project_name>.wsgi --log-file -
  • Install gunicorn, which is used in production while Django's built-in server is still used for local development: $ pipenv install gunicorn==19.9.0
  • Update ALLOWED_HOSTS in the settings.py file to '*' (see the sketch after this list).
  • Push the updates to the GitHub repository.
  • Log in to Heroku: $ heroku login
  • Create a new Heroku app: $ heroku create <app_name>
  • Set git to use the name of your new app when you push to Heroku: $ heroku git:remote -a <app_name>
  • If there are no static files, run: $ heroku config:set DISABLE_COLLECTSTATIC=1
  • Push the code to Heroku: $ git push heroku master
  • Scale up a free web dyno so the app is actually running online: $ heroku ps:scale web=1
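
After these steps, the Procfile and the relevant line of settings.py look roughly as follows (with <project_name> replaced by the actual Django project package):

    Procfile:
        web: gunicorn <project_name>.wsgi --log-file -

    settings.py:
        ALLOWED_HOSTS = ['*']  # '*' allows any host header; acceptable for this small deployment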

Contributing

Feel free to raise an issue or open a pull request to fix a bug or add a new feature. If you are new to open source, you can start by reading up on Git.
