Handshake
Helping students discover the right jobs at the right time.
Full case study documenting the redesign of Handshake’s weekly job digest to make job recommendations feel more relevant and engaging for students.
10-minute read
Role
Product Designer
Context
Internship
Timeline
June 2025 - August 2025
Problem + Challenge
The "Weekly Job Digest", sent each week to more than 7 million students, had become just another piece of noise in their inboxes.
How might we get a disengaged student to open an email and find its content compelling enough to click through?
Complicating this was the lack of clear success metrics. So before I could solve the problem, I first had to define it. My initial challenge was to establish what a "successful" email looked like.
Benchmarking
I analyzed other job digest emails to identify what worked, what didn't, and why.
Then I focused on how we could adapt those ideas for Handshake in a way that serves busy students looking for jobs.
Competitive Analysis
I also studied reengagement emails from different platforms and industries to spot common patterns.
Comparing them side by side revealed that we could lean into this email as a re-engagement tool.
This shifted my approach from delivering jobs → designing an email that pulls students back into Handshake.
Findings
From my benchmarking, these are the main insights that informed my design decisions.
The strongest digests worked because they reduced friction and helped users quickly decide if an email was worth their time.
Quick wins
Emphasizing short and scannable content makes it easier for students to engage.
Personal touches
Small touches of personalization noticeably boost trust.
Clear CTAs
Effective designs funneled attention toward a single obvious call-to-action.
Hypothesis
I developed a behavioral messaging matrix, linking our student personas, user journey stages, and known motivational levers.
I created this framework to decide which message would help each student move forward at a given stage in our critical user journey.
This would allow us to methodically craft impactful re-engagement messaging across the entire student experience, rather than only during the active job-search stage.
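To make the idea of the matrix concrete, it can be thought of as a lookup table keyed by persona and journey stage. The sketch below is purely illustrative: the persona names, stage names, and messages are hypothetical placeholders, not Handshake's actual personas or copy.

```python
# Hypothetical sketch of a behavioral messaging matrix.
# Persona and stage names below are illustrative placeholders only.
MESSAGING_MATRIX = {
    # (persona, journey_stage): (motivational_lever, example_message)
    ("explorer", "pre_search"):    ("curiosity", "See what roles students like you landed"),
    ("explorer", "active_search"): ("relevance", "5 new roles that match your major"),
    ("focused",  "active_search"): ("urgency",   "Applications for your saved jobs close soon"),
    ("focused",  "post_apply"):    ("progress",  "Employers viewed your profile this week"),
}

def pick_message(persona: str, stage: str) -> str:
    """Return the message for a persona at a journey stage,
    falling back to a generic digest line when no cell matches."""
    _lever, message = MESSAGING_MATRIX.get(
        (persona, stage), ("relevance", "New jobs picked for you this week")
    )
    return message

print(pick_message("focused", "post_apply"))  # Employers viewed your profile this week
print(pick_message("explorer", "graduated"))  # falls back to the generic digest line
```

The value of structuring it this way is that every cell forces an explicit decision: which lever moves this persona forward at this stage, and what message expresses it.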
Ideation - Personalization
In addition to improving the job digest email, I proposed strategically personalized emails tailored to each student’s stage in the user journey.
Implementation Constraints
We dropped the personalization path because our current system couldn't target at that granularity, and it risked inflating send volume.
This led me to explore how we could add value to the existing email without technical overhead.
I pivoted to highlighting job details in smarter ways using the data we already had.
Ideation - Presenting Value
I tested multiple layouts and copy variations to see how students might scan and engage.
The core idea guiding me was: how can I make the email itself more valuable and engaging? Design critiques helped me refine dozens of these ideas into a select few.
Design Convergence
I narrowed the exploration to three clear directions for the final design.
Each variant mapped to a key hypothesis:
'Is this worth a click?' (role description)
'Do I qualify?' ("matched on: ")
A plain-text 1:1 recreation of the original job card, to test whether current engagement came from content rather than design.
Plain-text styling improved scannability in internal testing and was predicted to drive higher-intent clicks, so all subsequent emails were designed in plain text.
Research
I surveyed over 100 students, asking them to rate each variant on trust, clarity, relevance, and other proxies for engagement.
The results showed that highlighting qualifications made the emails feel more credible, that the original digest was easy to scan, and that the plain-text control confirmed even simple formatting can improve readability and engagement.
Insights
I combined the strongest elements into a hybrid card design that raised perceived value by ~22%.
This design balanced student needs with technical constraints and validated my initial hypotheses. I learned that students rely on clarity about their qualifications and a scannable structure to quickly build trust.
Reflection
I learned to conduct end-to-end experiments that lead to data-backed feature trade-offs. 😌
This project taught me how to design within real technical limits and validate ideas through structured testing.
Sharing the student feedback and design results in cross-functional reviews built confidence in my recommendations and reinforced the importance of pushing for solutions that balance impact and effort.
Contact me: jusmas@umich.edu