Design to Code Using a Custom Design System with AI


This talk explores how we built an AI-powered system that transforms Figma designs into production-ready React code using Razorpay’s custom Design System. Learn how we solved the problem of brand inconsistency in generic AI tools and created a solution that understands our unique design language, enabling faster development without compromising on quality.

This talk was presented at React Summit US 2025.

FAQ

What is a design system?
A design system is a collection of reusable components, patterns, and guidelines that help teams build consistent, high-quality UI efficiently.

Can AI replace a design system?
While AI can rapidly generate UI code, it cannot replace the consistency, reusability, and accessibility provided by a design system like Blade.

What is Blade's MCP server?
Blade's MCP (Model Context Protocol) server connects AI models to different data sources and tools, facilitating tasks like converting Figma designs to code.

How does Blade generate UI code from Figma designs?
Blade uses its AI server to interact with Figma and OpenAI, gathering design context and generating UI code based on Figma designs.

What tools does Blade's MCP server provide?
Blade provides tools such as get-blade-component-docs for documentation and get-figma-to-code for converting Figma designs to code.

Why is high-quality documentation important?
High-quality documentation helps AI understand how to effectively use design system components, improving the accuracy of generated code.

What challenges came up while building this system?
Challenges included transitioning from a Figma plugin to an MCP, measuring generation accuracy, and integrating with other coding tools.

What are the key takeaways?
Maintain concise, accurate documentation, embrace learning, automate mundane tasks, and collaborate with smart, passionate people.

What is Blade?
Blade is a design system developed at Razorpay to ensure a consistent look and feel across all products, featuring over 65 reusable components and patterns.

How do AI and design systems complement each other?
AI can work with design systems like Blade to quickly generate UI code, enhancing productivity by combining the strengths of both.

Chaitanya Deorukhkar
19 min
21 Nov, 2025

Comments

  • Seungho Park
    LG Electronics
    Amazing talk! I am going to try right now.
Video Summary and Transcription
Chaitanya, Principal Engineer at Atlassian, discusses the Blade design system he helped build at Razorpay, the impact of AI on UI development, and how integrating AI with a design system enhances productivity. Writing detailed prompts for AI to build UI components is cumbersome; instead, imagine a seamless process where AI interprets Figma designs to create the UI. Developers can leverage their design team's expertise and focus on business logic rather than prompt writing. Blade's MCP server makes this possible, transforming Figma designs into UI code in collaboration with Figma and OpenAI.

1. Exploring Design Systems and AI Integration

Short description:

Chaitanya, Principal Engineer at Atlassian, discusses the design system at Razorpay, the impact of AI on UI development, and the integration of AI with design systems for enhanced productivity.

Hey, folks, I'm Chaitanya and I work as a Principal Engineer at Atlassian. I'm extremely happy to be here at React Summit today. I'm going to talk about the work I did at Razorpay, where I worked as a Staff Engineer. And I'm going to talk about AI, UI, DS. Well, that's a lot of jargon. I wonder who created this title? Let's dive deeper into each of these and untangle it slowly.

First up, what is a design system? It's basically a collection of reusable components, patterns, and guidelines that help teams build consistent, high-quality UI efficiently. Essentially, you build good and you build fast. And that's exactly what we did at Razorpay. We built a design system called Blade, and with Blade we were able to have a consistent look and feel across all of our products. We built more than 65 reusable components and patterns, and everything we were building was accessible right out of the box. We saw about a 3x boost in designer and developer productivity. We were very happy with what we achieved, and we loved Blade.

And that's when the AI boom hit us. Everybody started asking questions: do we really need a design system? Can AI do it better than the design system? Can we just replace the design system with AI? That got us thinking, and we started toying around with AI. What we found is that AI knows a lot. It definitely knows React and it knows TypeScript. So we asked it to build a UI. And it did a decent job. It actually built some UI, but it looked like a college project. What we realized is that AI is good at building UI code really, really fast, but it can't build good UI code really, really fast. Now I know this is very controversial, right? How do you define what good UI code means? So that's what we asked ourselves at Razorpay: what does good UI code mean for Razorpay? And the answer was simple. It needs to be consistent across our products, it needs to be reusable, and it needs to be accessible right out of the box. This sounded awfully similar to something we were already trying to solve with our design system. And it hit us that design systems and AI are a match made in heaven, like Romeo and Juliet minus the tragic ending: the design system ensures that whatever you build is good, and AI helps you do it really, really fast. So naturally we just asked AI to use our design system, and to its credit, it did a decent job, because Blade is already open source and AI has a fair bit of understanding of how to use it. But we ran into a bunch of errors that we were not able to solve, and we asked ourselves why it failed. We realized that AI knows a lot, but it doesn't know much about our design system.

2. AI-Driven UI Development with Figma Integration

Short description:

Detailed prompts for AI to build UI components can be cumbersome. Imagine a seamless process where AI interprets Figma designs to create UI. Leveraging design expertise and focusing on business logic, not writing detailed AI prompts. The integration involves linking Figma designs to Cursor IDE, generating code through Blade, and utilizing Model Context Protocol for AI interaction.

So we can't just say, hey, go ahead and build a login page. We need to tell AI to use these specific components, give it the documentation for those components, give it a few examples of how to use them, and give it some instructions about our internal coding practices. So we built a very detailed prompt to help AI understand how to build a simple login page. It had a really nice structure: tasks, guidelines, and context, which made it very simple for AI to understand how to build the UI. But this is extremely tiring to do every single time. And some developers would even argue that it's easier to just write the UI code than to write such detailed prompts for AI.

What would our dream prompt look like, if we don't want to write such long, detailed prompts? What if we could just give AI a Figma link and ask it to figure out how to build the exact same UI from Figma? This would ensure that we're leveraging the expertise of our design team and not relying on AI to come up with a new design. It would also mean we're not writing detailed prompts for AI and can spend most of our time focusing on the business logic. This sounds quite magical, right? Let's see if it actually works in action.

So here I have a Figma design of a simple login UI. I'm going to copy the link to the selection of this Figma design, give it to Cursor, which is our preferred IDE, and ask it to build this design. It's going to do a bunch of stuff on the right side that we'll ignore for now. It generated a bunch of code that we've accepted, and all of this code is coming directly from Blade. And if we run this code, it looks exactly like the UI design we had in Figma. That's absolute magic. But how does it actually work? Our Cursor IDE talks to Blade's MCP server, which in turn talks to Blade's AI server. To understand this better, we'll have to dive into the individual pieces. MCP stands for Model Context Protocol. It's a standard protocol for AI models to connect to different data sources and tools. Think of it like a USB port for AI: it makes connecting to new data or tools simple and universal. It works in a client-server setup, where you have MCP clients like Cursor or Claude Desktop, and MCP servers like Blade's MCP, which provide very specific data or features. That's still a lot of jargon, and it's hard to understand what it actually does. Here's how I understood it: if you have Alexa at home and you ask Alexa to play Uptown Funk, Alexa is hopefully not going to sing the song herself.
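The client-server flow above can be sketched as a minimal tool registry: an MCP server exposes named tools (the talk mentions get-blade-component-docs and get-figma-to-code), and a client like Cursor invokes them by name. This is a simplified illustration of the shape of that interaction, not Blade's actual server code; a real MCP server communicates over JSON-RPC, typically via the official MCP SDK, and the handler bodies below are stand-ins.

```typescript
// Simplified sketch of an MCP-style tool registry. A real MCP server
// speaks JSON-RPC through the official SDK; this only illustrates the
// "client calls a named tool on the server" flow described in the talk.

type ToolHandler = (args: Record<string, string>) => Promise<string>;

class MiniMcpServer {
  private tools = new Map<string, ToolHandler>();

  registerTool(name: string, handler: ToolHandler): void {
    this.tools.set(name, handler);
  }

  // An MCP client (e.g. Cursor) invokes a tool by name with arguments.
  async callTool(name: string, args: Record<string, string>): Promise<string> {
    const handler = this.tools.get(name);
    if (!handler) throw new Error(`Unknown tool: ${name}`);
    return handler(args);
  }
}

const server = new MiniMcpServer();

// Tool names taken from the talk; the bodies are hypothetical stand-ins.
server.registerTool("get-blade-component-docs", async ({ component }) => {
  return `Documentation for Blade component: ${component}`;
});

server.registerTool("get-figma-to-code", async ({ figmaUrl }) => {
  // In Blade's real pipeline, this step talks to Figma and OpenAI
  // via Blade's AI server before returning generated code.
  return `// React code generated from ${figmaUrl} using Blade components`;
});
```

With this shape, asking Cursor to "build this Figma design" boils down to the client calling something like `server.callTool("get-figma-to-code", { figmaUrl })` and inserting the returned code into your project.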
