Algorithmic Writing Assistance on Jobseekers' Resumes Increases Hires
van Inwegen, Munyikwa, Horton
2023. NBER Working Paper No. 30886 (published in Management Science, 2025). 18 citations.
Experimental evidence; Causal; Theoretical model
LLM / Generative AI; Platforms / gig economy; Writing / content; Junior / entry-level; Human-AI collaboration; Augmentation vs. substitution
Abstract
There is a strong association between the quality of the writing in a resume for new labor market entrants and whether those entrants are ultimately hired. We show that this relationship is, at least partially, causal: a field experiment in an online labor market was conducted with nearly half a million jobseekers in which a treated group received algorithmic writing assistance. Treated jobseekers experienced an 8% increase in the probability of getting hired. Contrary to concerns that the assistance is taking away a valuable signal, we find no evidence that employers were less satisfied. We present a model in which better writing is not a signal of ability but helps employers ascertain ability, which rationalizes our findings.
Summary
Van Inwegen, Munyikwa, and Horton conduct a randomized field experiment with 480,948 jobseekers on a large online labor platform to test whether algorithmic writing assistance on resumes causally affects hiring and post-hire satisfaction.
Main Finding
Treated jobseekers who received algorithmic writing assistance on resumes were hired 8% more often and earned 10% higher hourly wages, with no evidence that employers were less satisfied with hired workers, supporting the "clarity view" that better writing helps employers assess ability rather than masking low ability.
Primary Datasets
Online labor market platform data (~500K jobseekers)
Secondary Datasets
LanguageTool error detection software output
Key Methods
Randomized controlled trial (field experiment) on online labor platform with ~481K jobseekers randomly assigned to receive algorithmic writing assistance versus control
Sample Period
2021-2022
Geographic Coverage
International
Sample Size
480,948 jobseekers allocated to experiment (June-July 2021); 194,700 approved jobseekers with non-empty resumes in main analysis sample
Level of Analysis
Individual
Occupation Classification
None
Industry Classification
None
Notes
NBER WP 30886; published in Management Science (2025). Treated jobseekers received AI writing assistance on resumes and were hired 8% more often. No evidence employers were less satisfied.
[Claude classification]: Published version in Management Science (2025). The AI tool studied is a non-generative writing-assistance tool (Grammarly-like), not ChatGPT. Treatment improved writing quality (5% fewer errors overall, with larger effects for non-native English speakers) but did not change jobseeker search behavior. The paper includes a theoretical model showing how reduced resume-writing costs can increase hiring without reducing match quality. The authors use sentiment analysis of employer reviews, but this is a methodological tool, not the focus of the study. A power analysis shows the study could detect effects as small as 0.2 standard deviations in ratings.