Introducing Humans of Development: Chandrika (Shani) Jayant
What a shift from the private sector into international development can teach us about our work
Most of us are trying to shift away from international development. You left your private sector job to look for mission-driven work. Why?
I am Chandrika (Shani) Jayant. I've spent over a decade in private sector tech at Intel, Volkswagen Group, and Nokia, building products with the potential to reach millions of people and shaping inclusive design and accessibility strategy at scale.
It's been a winding and fascinating journey. I came from a wonderfully generative team of accessibility-focused computer and information science graduate students at the University of Washington, and I've always tried to be thoughtful about where to make an impact as I move through my career.
Growing up with a close family member with a disability shaped how I see the world. I learned early that good design isn't just about aesthetics; it's about dignity, access, connection, and joy.
In tech, I've had the privilege of working on cutting-edge AI, computer vision, and autonomous vehicles (AVs), and there is always a tension about how and why these new tools are being developed, and a need to consider their impact on marginalized communities.
Leading accessibility research for AVs and partnering with disability organizations, I discovered both the challenges and opportunities of doing mission-driven work within corporate structures.
You can't be the only advocate in a room—you need allies across teams, leadership buy-in from the top, and grassroots momentum from the bottom. I learned the importance of building coalitions, not just within your company but across the industry.
Corporate environments have unique levers for impact—resources, scale, regulatory influence—and they also have constraints. Whether it's government, industry, nonprofits, or startups, I'm looking for places where my values can drive real change.
Right now, as I consult with mission-driven companies, I'm also volunteering with Insight World Aid and serving on the board of Sacred Mountain Sangha, exploring how different organizational models can create impact.
You led pioneering research into Human-AI teaming. What is the impact of your work?
Back in my PhD work, I used computer vision to build novel interaction methods that helped blind and low vision people take photographs. Along with Dr. Jeffrey Bigham and team, we also created VizWiz—a system that helped blind people take better photographs by crowdsourcing real-time feedback.
It was an early example of humans and AI working together seamlessly: computer vision would analyze the image quality, while human volunteers provided context about what was actually in the photo.
What made it special wasn't just the technology, it was centering the needs of people who had been excluded from photography entirely. We weren't just building assistive tech; we were reimagining what collaboration between humans and machines could look like when you start with meaningful questions.
The impact has been profound. Features inspired by this work are now built into Apple and Google's accessibility tools. But more importantly, it established a framework for ethical AI development: always involve the communities you're designing for, always consider who's left out, and always ask how intelligent systems can amplify human agency rather than replace it.
At Intel, I applied these principles to spatial computing, designing a computer vision-based wearable navigation aid for blind users that we demonstrated at CES. It's about designing systems that don't just automate, but that genuinely collaborate with human intelligence and intuition. We’re seeing the continued need for responsible tech right now as things are moving very quickly.
How do you keep ethics front and center in fast-moving environments?
Short answer: it’s difficult! Ethics can't be an afterthought or the job of a separate team; it has to be woven into every decision from day one. This is especially crucial because AI can make bad decisions easily and quickly. We can't keep building first and asking questions later.
At Volkswagen, I embedded inclusive design principles into our product development process from the very beginning. Every project started with questions like: Who isn't represented in this room? Whose voices are missing from our research? What are the unintended consequences if this goes wrong?
My approach to ethics is deeply informed by my practice as an eco-chaplain (someone who works with people in ecological crises) and meditator. Both traditions teach you to slow down, to listen, consider interconnectedness, and to ask how your actions ripple out into the world. When I'm in nature or in meditation, I'm reminded that everything is connected—the technologies we build, the communities they serve, the planet we all share.
To keep ethics front and center, it’s important to make it concrete and actionable through design principles, and “shifting left” as they say in software development, making sure there are metrics and guidelines touching all parts of the design cycle.
I also learned that you have to create space for these conversations wherever you work. When you're moving fast, it's tempting to assume good intentions are enough. But ethical AI requires slowing down just enough to ask the hard questions, and having the courage to change course when needed.
You’ve spent 15 years hosting radio talk shows. Tell us about a memorable interview you conducted.
Radio taught me the art of deep listening—a skill that's become more crucial than ever as AI grows more entangled in our lives and the beauty of process risks being lost. Whether I was interviewing bands like Blonde Redhead at WNYU or showcasing emerging artists like James Blake or MIA at KEXP, radio connected me to incredibly diverse communities and musical histories I never would have encountered otherwise.
There's something magical about radio that mirrors what I try to do in my research work: creating space for voices that might not otherwise be heard, staying curious about stories that challenge your assumptions, and building bridges between different communities.
Radio taught me to approach every conversation with genuine curiosity—whether I was talking to a punk band about their DIY ethos or asking how someone’s immigrant history shaped their approach to music and life. And to leave room for space: taking a beat, seeing what emerges, and what is created between the listener and the speaker. Who holds the space matters.
That translates directly to user research and product strategy. When I'm interviewing a wheelchair user about their taxi experiences or a disability advocate about smart glasses, I'm using the same skills: creating space for people to share their authentic experiences, asking follow-up questions that reveal something unexpected, and listening for the stories that might reshape how we think about a problem.
Radio also taught me about the power of storytelling and communication—how to make complex ideas accessible without losing their nuance.
What concerns me is how much we're losing these communication skills as AI automates more of our interactions. The ability to have genuine conversations, to listen deeply, to pick up on subtlety and context—these fundamentally human capacities are exactly what we need to preserve and strengthen.
And we need to use AI thoughtfully and intentionally as it increasingly becomes a force woven into this world of ours.
What advice would you have for development professionals trying to pivot into the private sector?
Your systems thinking and stakeholder management skills are incredibly valuable; don't underestimate them. Development work teaches you to navigate complex, multi-party situations with limited resources and competing priorities. That's exactly what product management and strategy roles require.
But also, learn to speak private sector language. Instead of "beneficiaries," talk about "users." Instead of "interventions," talk about "solutions" or "products." Your ability to work with diverse communities and understand local contexts translates directly to user research and market expansion.
Most importantly, don't abandon your values—use them as a differentiator. Your experience working with marginalized communities gives you insights that can help companies avoid costly mistakes, tap into underserved markets, and keep their moral compass more closely aligned with goodness, which the world needs so much of right now.
The private sector needs people who understand that good products don't just solve problems efficiently—they solve problems equitably.
Bring that perspective with you; I hope you'll find there's a hunger for it.
Please contact Sulakshana Gupta if you’re a paid Career Pivot subscriber who would like to be profiled in a future Humans of Development post on Career Pivot and LinkedIn.