Kastane: A custom ATS built to find the best software engineering talent
The TLDR
The Challenge
'Kastane' is a job applicant filtering software designed and developed by Atera Technologies Pvt. Ltd. The tool is built specifically for recruiting and filtering software engineering talent within Sri Lanka.
My Role
I operated as the Product Design Lead, guiding the team through research, multiple design reviews, usability testing, and developer collaboration, while mentoring them to sharpen their research and design skills.
The Process
I led the team in conducting extensive interviews, revising personas, and generating user stories. We developed an information architecture map, performed iterative wireframing, and addressed usability issues. We created high-fidelity prototypes for key flows and conducted usability testing to identify further improvements.
The Impact
By the end of the project, we had a demo-ready MVP, an upskilled design and research team, and formalized research sprints that aligned with development sprints.
The Problem: Designing a useful talent screening software that can plug into existing HR tech
Atera recruited me to help lead product development of their ATS tool, code-named 'Kastane': a job applicant recruitment and filtering platform designed to attract and filter software engineering talent at scale.
The idea for the tool came out of multiple requests by clients who wanted a “plug and play solution” that integrated into their preexisting ATS and CRM tech stack “without trying to replace it”.
So, the team began designing a tool around the following questions:
My job: help the team develop a convincing MVP client demo and validate the business use case.
I was brought on because the team had reached an impasse. With no prioritization structure, they couldn't decide which features to focus on; everything seemed equally important. I was tasked with adding structure around the research and design processes and helping the team push through.
Unlike American companies, which have access to dedicated recruiting departments and external recruitment firms, Sri Lankan companies get involved in hiring decisions from the get-go.
When we aggregated our findings from 6 companies, we produced the process diagram, personas, and rough information architecture diagram shown below.
While mapping out ALL our features was useful for giving us a holistic view, I eventually narrowed the team's focus for the MVP onto the Job Creator features. I wanted them to validate this core with our target users and client BEFORE moving on to the job applicant features, tightening the loop between design and validation.
Our main goal was to make the application process for candidates as easy and familiar as possible, while arming hiring managers with better tools to hone in on the candidates worth their time.
So we designed our final prototype features to cater to our two primary users: job creators, and job applicants.
Job Creator
(Recruiters & Hiring managers)
Create a job and publish it.
Create or use a premade skills test.
Use advanced filters to compare candidates.
Shortlist and export candidates as a PDF / .CSV file.
Job Applicant
(Uni Students & Professionals)
Apply to a job via generated link on any website / email. (account not needed)
Upload resume, verify details, complete the skills-test, and submit.
Receive automated updates via email.
Users wanted to quickly see the most important stats across as many of their posted jobs as possible.
We created an 'All Job Postings' page with key summary stats that would communicate the most essential information at a glance.
Writing job descriptions and manually closing jobs were huge friction points that, if left unresolved, would cost us user adoption.
We allowed PDF uploads and plain-text pasting to reduce friction and enhance familiarity, with only the title, level, and category being new and mandatory.
The job closing conditions allow for automatic closing if a date or applicant number is reached.
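As a rough illustration of that closing rule (a hypothetical sketch, not the shipped implementation; names are mine), either condition being met closes the posting:

```python
from datetime import date

def should_close_job(close_date, applicant_cap, applicant_count, today=None):
    """Close a posting when its close date passes OR its applicant cap is reached.

    Both conditions are optional: pass None to disable one.
    """
    today = today or date.today()
    if close_date is not None and today >= close_date:
        return True
    if applicant_cap is not None and applicant_count >= applicant_cap:
        return True
    return False
```

A posting with neither condition set simply stays open until closed manually.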
In the time before OpenAI, we needed a feature that allowed job creators to define requirements in a structured way. Our platform asks users to enter their desired skills, skill levels, technologies, and education, all of which are connected to quick skill tests.
The central issue we heard across all our research was applicants who "did not meet our minimum criteria in years of experience, hard skills, and relevant job experience".
This was the core problem we were trying to solve for, and the greatest opportunity we had for product differentiation.
We designed a custom library of micro-tests that tested applicants' 'skill floor'.
These tests were short, LeetCode-inspired problems that made sure applicants had the bare-minimum competency required for the role.
AKA: are they lying?
The team revised personas and user stories, which helped align stakeholders and R&D teams around a singular vision.
Improved mid-fidelity wireframes successfully addressed usability issues in the product.
Usability testing validated the MVP's usability, while also uncovering areas that needed improvement.
The team provided comprehensive findings and recommendations for future iterations of the product.
An MVP was successfully delivered for development and released on time.
Accessing the correct user demographic for testing proved challenging, resulting in a near-random assortment of users being tested.
The applicant filtering approach was deemed contrarian and may face adoption resistance due to its steep learning curve.
Testing remained in the concept validation stage and did not progress to optimization or A/B testing.
The product's positioning as a plug-and-play solution conflicted with client preferences for an end-to-end solution.