Showcase

Job Scraper

Built to gather public job listings and manage them all in one robust dashboard

During my job hunt, the part that took the longest was searching through all the job listings on LinkedIn. Even after applying filters and a location, sponsored postings are always listed at the top whether they match my search or not. Combine that with the endless fake jobs posted by recruiting agencies, and pretty soon I was spending a few hours searching for jobs but only a few minutes actually applying to the ones I wanted. So I came up with the idea to scrape public job listings from LinkedIn (no login required) and build a useful dashboard to manage them, so I could spend most of my time applying to jobs I actually wanted.

  • Python data gathering: This is my first time using Python in a full-stack project! I used it to gather job listings and populate my database.
  • Supabase back end: I chose Supabase as my back-end database. I really liked how simple it was to use for a personal project!
  • Robust dashboard: I used NextJS 15 and TypeScript to build a simple yet powerful dashboard for managing all of my gathered job listings.

Overview

Searching for jobs on LinkedIn takes way too long. Even after applying filters and sorting, you have to wade through endless sponsored postings and fake listings posted by recruitment agencies or companies that just want your data for "future opportunities". Not to mention the endless jobs posted as entry-level that then list four years of experience as a requirement in the description!

I decided to build a full-stack application to find jobs for me based on my own preferences and populate a database that I could access anywhere from a responsive, powerful dashboard. By building my own solution, I could ensure that every listing I look at is relevant to my search while drastically cutting down the time it takes to apply.

Python data gathering

During my time as a student at Utah Valley University I wrote many projects using JavaScript and TypeScript, but I didn't have much experience with Python outside of a fundamentals class. I knew Python is one of the most popular programming languages and has many real-world applications, so I decided to dive deeper into the language and add it to my skill set as a developer. I had always been interested in using a Raspberry Pi to build and host my own apps with Python, so I purchased one to use as my personal server.

I quickly fell in love with Python for its simple syntax and intuitive approach to writing applications. For this project I used a library called BeautifulSoup to scrape public listings from LinkedIn, then filter and compile them into a usable form for my database. With this tool I was able to start from a set of URLs matching my own preferences, then sort through the HTML to find the relevant data I needed. After that, I just connected my Supabase database and populated it with the compiled data for consumption later on the front end. I scheduled the Python script to run daily on my Raspberry Pi using Linux's built-in cron scheduler.

BeautifulSoup provides easy access to data from gathered HTML
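
To make the scraping step concrete, here is a minimal sketch of the approach. The search URL parameters, CSS selectors, and field names are placeholders of my own; LinkedIn's public markup changes over time, so they would need to be checked against the live page before being relied on.

```python
# Minimal sketch: fetch a public LinkedIn search page and pull out job cards.
# The URL parameters and CSS class names below are illustrative placeholders.
import requests
from bs4 import BeautifulSoup

SEARCH_URL = (
    "https://www.linkedin.com/jobs/search"
    "?keywords=software%20engineer&location=Utah"
)

def scrape_jobs(url: str) -> list[dict]:
    html = requests.get(url, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")

    jobs = []
    for card in soup.select("div.base-search-card"):  # one card per listing (placeholder selector)
        title = card.select_one("h3.base-search-card__title")
        company = card.select_one("h4.base-search-card__subtitle")
        location = card.select_one("span.job-search-card__location")
        link = card.select_one("a.base-card__full-link")
        if not (title and company and link):
            continue  # skip cards missing the fields the dashboard needs
        jobs.append({
            "title": title.get_text(strip=True),
            "company": company.get_text(strip=True),
            "location": location.get_text(strip=True) if location else "",
            "url": link["href"],
        })
    return jobs

if __name__ == "__main__":
    for job in scrape_jobs(SEARCH_URL):
        print(job["title"], "-", job["company"])
```

On the Raspberry Pi, a crontab entry along the lines of `0 7 * * * python3 /home/pi/job-scraper/scrape.py` (the path here is just an example) runs the scraper every morning.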

Supabase database

I chose Supabase for my database because of its fast setup and ease of use for personal projects. I was able to create a free Supabase project, set up authentication, and begin populating data in only a few hours. If I had chosen a different solution and written my own authentication, it would have doubled the hours I spent on the project! For a personal project like this, Supabase was the perfect fit.

Simple and intuitive auth with Supabase

Supabase provides powerful tools for managing your data that save hours of development time on personal projects. You can set custom auth policies based on user groups, and it even generates API docs for your tables so you can start manipulating data right away. There is also documentation for integrating your database with popular frameworks like NextJS, so I was able to dedicate my time to actually building my app!

Table editor with many useful and simple tools for your data
Manipulating data with Supabase is fast and easy
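
On the Python side, handing the scraped listings to Supabase only takes a few lines with the supabase-py client. This is a hedged sketch: the table name, column layout, and environment variable names are assumptions for illustration rather than the exact schema.

```python
# Sketch: push scraped rows into Supabase with the supabase-py client.
# Table name ("jobs"), columns, and env variable names are illustrative assumptions.
import os

from supabase import create_client

supabase = create_client(
    os.environ["SUPABASE_URL"],
    os.environ["SUPABASE_SERVICE_KEY"],  # service key so the scheduled script isn't blocked by auth policies
)

def save_jobs(jobs: list[dict]) -> None:
    # Upsert keyed on the listing URL (assumes a unique constraint on "url"),
    # so re-running the daily scrape doesn't insert duplicate rows.
    supabase.table("jobs").upsert(jobs, on_conflict="url").execute()
```

Keying the upsert on the listing URL means the daily cron run can safely re-process listings it has already seen without creating duplicates.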

Dashboard

I built the dashboard for managing my gathered job listings with NextJS 15, TypeScript, and Tailwind, a stack I was already comfortable with from school and personal projects. I like NextJS because it adds so much on top of React, like SSR, file-based routing, and image optimization. TypeScript makes the code production-ready and type-safe, and Tailwind makes editing the look and feel of the site much faster and more streamlined.

My original goal for the app was to save time applying for jobs, so the dashboard had to be simple but still offer powerful features for sorting and analyzing job listings. After logging in, the user lands on the main dashboard where they can see jobs right away. From there they can sort jobs by whether they've applied, and filter by apply date, the date the job was added, location, and whether the jobs are remote or local. I also added a search bar to search the filtered jobs by job title, company, or location.

Powerful filtering and sorting tools
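
To keep the code examples in this write-up in a single language, here is roughly what that filter query looks like expressed with the supabase-py query builder. The dashboard itself builds the equivalent query with supabase-js inside NextJS, and the column names below are placeholders of my own rather than the app's actual schema.

```python
# Rough Python equivalent of the dashboard's filter/search query.
# The real app does this with supabase-js; column names ("applied", "remote",
# "date_added", "title") are placeholders for illustration.
import os

from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_ANON_KEY"])

def fetch_jobs(applied: bool | None = None,
               remote: bool | None = None,
               search: str = "") -> list[dict]:
    query = supabase.table("jobs").select("*").order("date_added", desc=True)
    if applied is not None:
        query = query.eq("applied", applied)         # only applied / not-yet-applied jobs
    if remote is not None:
        query = query.eq("remote", remote)           # remote vs. local listings
    if search:
        query = query.ilike("title", f"%{search}%")  # the real search also matches company and location
    return query.execute().data
```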

I implemented several UX features to make the dashboard more responsive and intuitive. The user can switch the entire app between light and dark mode at the click of a button! Colored tags show at a glance which jobs are applied or active, and alerts pop up to give users feedback on the actions they take, whether they succeed or fail. The site is fully responsive on desktop, tablet, and mobile.

Alerts give the user feedback on actions they take

What I learned

After completing this project I am much more comfortable building full-stack applications. I can take an idea or a problem I have and turn it into a fully functional application that I can access anywhere and that provides real value! I learned about user authentication and roles, manipulating data in a real database, and building an experience that is useful and powerful for accomplishing real work. My confidence in taking an idea all the way to a complete app is much greater, and I can use these skills in any job to build powerful solutions for people and make the experience great for them.