Exploring AR Try-On with React Native


React Native can be much more than a toolkit for building mobile UIs in JavaScript. We’ll explore how to use features beyond the core library, and use the built-in native module system to integrate AR capabilities into your mobile app.

This talk was presented at React Summit 2022; check out the latest edition of this React conference.

FAQ

AR try-on in React Native is a feature that allows users to try on virtual items, such as shoes or glasses, using Augmented Reality. It is implemented using the underlying AR platforms of iOS (ARKit) and Android (ARCore) with React Native bridges like Viro.

Kadi is the head of mobile development at Formidable and has been an engineer for about ten years. She has been working with React Native for the past five years.

AR (Augmented Reality) enhances the real world with computer-aided graphics, usually viewed through a screen. VR (Virtual Reality) is a fully simulated experience that requires a VR headset to provide an immersive experience.

Examples of AR Tryon in e-commerce include Amazon's AR-powered virtual shoe tryout in their iOS app, Bailey Nelson's AR glasses try-on, and Gucci's AR features for trying on shoes, makeup, and nail polishes.

ARKit is Apple's AR platform available from iOS 11 onwards, launched in 2017. ARCore is Google's AR platform for Android, launched in 2018 and supported on Android 7 and onwards.

Viro is a React Native bridge for ARKit and ARCore, allowing developers to implement AR and VR features in their React Native applications without writing native code.

The Snap CameraKit SDK is an AR technology library provided by Snap Inc. It powers various AR features in the Snapchat app and can be used for virtual try-on experiences in other applications. It supports both front and back cameras and various AR experiences like makeup and clothes.

Given the tight deadline of weeks, the team opted for off-the-shelf solutions like Wanna and Snap CameraKit SDKs instead of building a custom AR solution from scratch. This allowed them to meet the client's requirements efficiently.

The client required an AR shoe try-on feature to be integrated into their main shopping app, with a bespoke UI, minimal bundle size, and compatibility with lower API levels. The project had a tight deadline of weeks rather than months.

Learning native code (Swift for iOS and Kotlin for Android) enhances the capabilities of React Native developers by allowing them to integrate native-only SDKs and write custom bindings, thereby expanding the range of features they can implement in their applications.

Kadi Kraman
20 min
17 Jun, 2022

Video Summary and Transcription
This Talk discusses exploring AR try-on with React Native, implementing AR try-on experiences in e-commerce apps, and considerations for AR development. It also covers the integration of AR platforms like ARKit and ARCore with React Native using the Viro bridge. The Talk highlights the use of off-the-shelf solutions like Wanna's SDK for virtual try-on and Snap's AR technology and shopping extension. The importance of creating 3D models for AR try-on and the challenges of writing native code for AR development are also mentioned.

1. Introduction to AR and VR

Short description:

I'm excited to talk about exploring AR try-on with React Native. I'll discuss what's happening in the AR and VR space and share a case study of implementing a virtual try-on feature. AR enhances the real world with computer-aided graphics, while VR offers a fully simulated experience. AR is becoming an expected part of the shopping experience, as seen with Amazon's AR-powered virtual shoe tryout. Examples include Bailey Nelson's AR try-on for glasses and Gucci's app with various AR features.

I'm really excited to be here today, and I'm going to be talking about exploring AR Tryon with React Native. So, here is the intro slide.

Hi, my name is Kadi. As Yanni said, amongst his very kind words, I'm currently the head of mobile development at Formidable. I've been an engineer for about ten years, but for the past five years since 2017, I've been building things in React Native. I've been very fortunate to be able to work on some really exciting projects in React Native, and I'm very excited to share one of them with you today.

So, in this talk, it's going to be in two parts. So, first, we are going to talk in general about what's happening in the AR VR space, and in the second part, I'm going to go through a bit of a case study of how we actually implemented a virtual try on feature in a React Native application.

So to kick us off, what's currently happening in the AR and VR space? I'll do a quick clarification on the difference between AR and VR, because we tend to use those terms together. VR stands for virtual reality, and it is a fully simulated experience: it may look like the real world, or it may look completely different. The distinguishing feature of a VR experience is that you will have some sort of VR headset, so it will be an Oculus or a Google Cardboard or a Valve Index, but the point is, you need to wear something in order to have this immersive experience. An example of that is Beat Saber, which you'll see on the screen.

AR stands for Augmented Reality, and it is a process in which the real world is enhanced by some computer-aided graphics. Usually, for an AR experience, you'll be looking through a screen: it could be your phone screen or your laptop screen, with a camera recording the real world, and then your device will add some kind of computer-aided enhancements. For example, you could use it to place furniture in your space.

If you've followed the news recently, you might have seen that last week there was a news story making the rounds that Amazon has launched an AR-powered virtual shoe tryout experience in their iOS app. Now, this in itself is not particularly groundbreaking. Amazon are nowhere near the first, and they're not going to be the last, company to launch something like that. But the reason I found it significant is that it speaks to a trend that is happening more and more: in e-commerce, AR is going to stop being a gimmick. It's stopping being the cool thing that maybe 5% of people would use, and it's more and more starting to be an expected part of our shopping experience.

And just to give you a couple of examples of myself using AR in my shopping experience: these were things I used before I even knew I was going to be giving this talk. Bailey Nelson is a company that sells glasses. I like their glasses quite a lot, and on their website they have an AR try-on where you can try on glasses before you purchase them. Gucci also have an app that's just for experience features, with a couple of different AR features: they have a shoe try-on, a makeup try-on, and in this example, you can also try on different nail polishes.

2. Moving into an Unfurnished House

Short description:

I recently moved into a completely unfurnished house and had to furnish it from scratch. I struggled with placing a sofa in my weirdly shaped living room, but ultimately settled for a two-seater sofa.

I recently moved house. Well, I say recently; it was the beginning of the year. And it was my first time moving into a completely unfurnished house, so I needed to furnish it from scratch. And here was me trying to figure out how to place a sofa into my very weirdly shaped living room. So, as you can see, that corner sofa was wishful thinking, but I did end up going with a smaller version of it, the two-seater sofa.

3. Introduction to AR Platforms and Viro

Short description:

Both iOS and Android provide AR platforms called ARKit and ARCore respectively. React Native has a bridge called Viro that connects React Native with ARKit and ARCore. Viro is an open-source library that simplifies AR development in React Native.

Now you might be wondering, how do I do this in React Native? Well, as with everything in React Native, we're going to start by looking at the underlying platforms. Both iOS and Android provide an AR platform. On iOS, it's called ARKit; it was launched in 2017 and it's available from iOS 11 onwards. Android, not to be far behind, launched the equivalent, called ARCore, in 2018, and it's supported on Android 7 and onwards. As ever with React Native, someone somewhere has written the React Native bridge for the native libraries, and in this case it's called Viro, which basically provides the React Native bridge for ARKit and ARCore. It's an open-source library. I think they've moved it into the community space now; it's not actively maintained by the original authors. But if you just want to try out AR in React Native with no native code to write yourself, then this would be a good place to start.

4. Implementing AR Try-On

Short description:

We implemented a virtual try-on experience in a React Native app for an e-commerce client. They wanted users to try shoes before purchasing, as part of the main shopping app. The UI was designed before finding a provider or building any code. Due to limited resources, we couldn't start from scratch. We found a suitable solution based on these requirements.

Now, let's look at how we actually implemented a virtual try-on experience in a real-world React Native app. As always, let's start with the requirements. What our client needed was an AR shoe try-on experience. We are in the e-commerce space, selling shoes amongst other things, and on our product details page, we want users to be able to try the shoe before they purchase it, and hopefully improve conversion. They wanted this to be part of the main shopping app. Some companies have decided to have a separate app specifically for experience features, and Gucci is an example of that, but this client wanted it to be part of the application that all users will have, regardless of whether they actually use the AR feature. They also wanted this to be a bespoke UI, so the UI was designed well before we even got to finding a provider or writing any of the code. And in terms of resources, we had in the order of weeks rather than months or years to build this, meaning it was unlikely we would be able to start completely from scratch. So based on these requirements, we found a solution that would work for us.

5. Considerations for AR Try-On

Short description:

To build the AR Try-On feature in our shopping app, we had to consider the bundle size, API level, and limited development time. We opted for an off-the-shelf solution for the native platform and integrated it with React Native. Learning native code, such as Swift for iOS and Kotlin for Android, can greatly enhance the capabilities of a React Native application.

So because it's part of the main shopping app, the bundle size and the API level matter. We can't add anything that would add hundreds of megabytes, because every single user has to download this app regardless of whether they use the feature. Also, with the API level, we need to make sure that our minimum API level is as low as possible, again to make sure that as many users as possible are able to install the app.

And finally, because we have weeks and not months to build it, our only real option is to use an off-the-shelf solution for the native platform and build the React Native integration with it. And just as a side note at this point: to really take your React Native engineering to the next level, I very much recommend that you start learning native code. If you're someone that came from the web, like me, and your first experience with iOS development was a bit of Objective-C, you might find it terrifying: what are all these brackets? But on iOS, you can use Swift, and on Android, you can use Kotlin. Both of these languages are much more lightweight, much more functional, and much easier to grasp for a JavaScript developer. And once you get more acquainted with the native platforms and languages, it will really expand the range of features you're able to add to your React Native application, because it opens the door to installing native-only SDKs and writing your own bindings comfortably.
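To make the idea of "writing your own bindings" concrete, here is a minimal sketch of what the JavaScript side of a custom native binding might look like. In a real app the module would come from `NativeModules` in `react-native`; here it is mocked inline so the example is self-contained, and the `ShoeTryOn` module name and `isSupported` method are hypothetical, not part of any real SDK.

```typescript
// Hypothetical shape of a custom native module. The Swift/Kotlin side
// would implement isSupported() by checking ARKit/ARCore availability.
type ShoeTryOnModule = { isSupported(): Promise<boolean> };

// Inline mock standing in for `NativeModules.ShoeTryOn` from react-native.
const ShoeTryOn: ShoeTryOnModule = {
  isSupported: async () => true,
};

// JS-side wrapper: if the native module is missing or throws (e.g. on a
// platform without AR support), we simply hide the AR entry point.
async function canLaunchTryOn(module: ShoeTryOnModule): Promise<boolean> {
  try {
    return await module.isSupported();
  } catch {
    return false;
  }
}
```

The point of the wrapper is that the rest of the app only ever asks a simple JavaScript question, while all the platform-specific work lives behind the binding.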

6. AR Try-On with Wanna

Short description:

We looked at a company called Wanna that specializes in virtual try-on and provides an SDK for building your own integration. The SDK allows you to try out the virtual try-on experience for shoes. We received a demo SDK from Wanna and integrated it into our app. The API was straightforward, requiring minimal native code. However, the higher minimum API levels for Android and iOS were a dealbreaker for us, as our client wanted to support older devices.

So for our AR try-on, we looked at two different companies. The first company we looked at is called Wanna. They are a company that specializes in virtual try-on. They have an SDK where they've implemented a virtual try-on experience for shoes, and you can use their SDK to basically build your own integration. You can actually try this out yourself: their example app is available on both the iOS and Android stores. It's an app that uses their own SDK, and you can see what the try-on experience would look like. From an implementation point of view, we got in touch with them, and they sent us a demo SDK so we could have a look at integrating it and seeing what it would look like in our app. Here are some screenshots from when we built the integration. This was obviously a proof of concept, so the UI is nothing to write home about. But in general, the API itself was pretty straightforward. There wasn't a huge amount of native code we had to write in order to build the integration. It was fully customizable, so we were able to just render the virtual try-on video and then do everything else in JavaScript. The thing that was kind of a dealbreaker for us is that it had quite high minimum API levels compared to what we were aiming for: it requires Android API 26 whereas we were at 24, and it requires iOS 13 whereas we were at 11. Now, this is a tradeoff. For some clients, customers, and use cases this doesn't matter, and you can just aim for higher API levels. But in our case, our client was quite keen to make sure that no one gets left out because of their older devices, so we didn't want to upgrade.

7. Snap's AR Technology and Shopping Extension

Short description:

Snap, the company behind Snapchat, has a developer portal where you can access their AR technology. They offer libraries like CameraKit, which powers Snap and provides more than just virtual try-on for shoes. They also have a new AR shopping extension in beta that creates an integrated experience for AR on e-commerce product pages.

Did you know that Snap, the company that builds Snapchat, has a developer portal? You can use a lot of their AR technology in your own application; check out developers.snap.com. They actually do more than just AR try-on, but the point is that they provide a couple of libraries, most notably for us a library called CameraKit, which is basically the SDK that powers Snapchat itself. So it does a lot more than just virtual try-on for shoes: it does makeup, it does clothes, it does front camera, back camera, everything you see in Snapchat. It's very, very powerful, and under certain circumstances, you can use it in your own application. And something new that's recently been built to enhance it, currently in beta, is the AR shopping extension. This is something that Snap provides alongside their CameraKit, and it's specifically built to create an integrated experience for AR on the e-commerce product details page, within the shopping experience. Using Snap's SDK, this is what we ended up with. Starting from the left, we have the product listing page with the AR badge on the products that have an AR experience. Then we have the product details page with the AR badge. And finally, on the right, we have the modal with the actual AR experience.

8. Implementing AR Experience

Short description:

I haven't shown you any code, which is basically a crime, so I've added some code. We use the SDK to determine if a product has an AR experience and show the AR badge. The UI around the AR experience is rendered using React Native, while the camera experience is rendered natively using Snap's SDK. We pass the product ID to the native component, fetch the products and Lens using Snap's SDK, and call the necessary callbacks. The end-to-end experience includes launching AR, viewing a 3D model of the shoe, swapping to the AR experience, and taking pictures to share.

I realize this is a technical talk and I haven't shown you any code, which is basically a crime, so I've added some code. I'm not going to show you the native side of things; I'm just going to show you what we actually import and use on the JavaScript side.

So, on the product listing page, we have these AR badges. The SDK is the source of truth on whether something has an AR experience: we pass in the product IDs, and it does its magic and tells us whether or not each product has an AR experience. If it does, we show the AR badge. We use the exact same method on the product details page.
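The badge check described above can be sketched as a small helper. This is an illustrative sketch only: the SDK call is passed in as a function, and `getArEnabledProducts` style behavior is mocked, since the real method names belong to a proprietary SDK.

```typescript
// The SDK's availability check, abstracted as a function: given product
// IDs, it resolves to the subset that has an AR experience.
type ArAvailability = (productIds: string[]) => Promise<string[]>;

// Build a lookup of productId -> "show AR badge?". The SDK is the source
// of truth, so we only badge products it confirms.
async function getArBadges(
  checkAvailability: ArAvailability,
  productIds: string[],
): Promise<Record<string, boolean>> {
  const enabled = new Set(await checkAvailability(productIds));
  return Object.fromEntries(productIds.map((id) => [id, enabled.has(id)]));
}

// Mock SDK for illustration: pretend only the first product has AR.
const mockCheck: ArAvailability = async (ids) => ids.slice(0, 1);
```

On a listing page you would run this once for all visible product IDs and render the badge from the resulting lookup, rather than calling the SDK per product.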

And this is the interesting part: the actual AR experience. Being the React Native developers we are, we made the conscious choice to have as much of the code as possible on the React Native side. So the attribution, the product name, the camera button, and the variant picker are all rendered using React Native. The only thing actually rendered natively, using Snap's SDK, is the camera experience itself.

The way we've exposed this is as a native component, and the way it works under the hood is that we pass the product ID in to this native component. Then, using Snap's SDK, we fetch the products for that variant. This is from the shop kit. Then we call the onProductsLoaded callback, which populates the products underneath the screen. Next, the SDK fetches the Lens for the selected product. A Lens is Snap terminology, but it's basically the AR experience with something inside it; in this case, the pair of shoes. And when the Lens is fetched, we call the onLensLoaded callback, which tells us that it's time to stop the loading spinner and the user's good to go.
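The callback flow above can be modeled as a tiny state controller. The callback names (`onProductsLoaded`, `onLensLoaded`) follow the talk, but the controller itself is an illustrative sketch of the sequencing, not Snap's API: in the real app these callbacks would be props on the native component and would drive React state.

```typescript
// State the JS side cares about while the native AR view loads.
type TryOnState = { products: string[]; lensReady: boolean; loading: boolean };

function createTryOnController() {
  const state: TryOnState = { products: [], lensReady: false, loading: true };
  return {
    state,
    // Fired by the native side once products for the variant are fetched;
    // these populate the product strip underneath the camera view.
    onProductsLoaded(products: string[]) {
      state.products = products;
    },
    // Fired once the Lens (the AR experience) is fetched: time to stop
    // the loading spinner, the user is good to go.
    onLensLoaded() {
      state.lensReady = true;
      state.loading = false;
    },
  };
}
```

The ordering matters: products load first, then the Lens for the selected product, and only after the second callback should the spinner be hidden.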

And let's look at this experience end-to-end. Here I'm on a product details page, and I'm going to scroll down to launch AR. In this case, it's the first time we're launching it, so we have to say yes to camera access and agree to the terms, which we've definitely read. Here we can also look at the 3D model of the shoe, which is actually built into the Lens, which is pretty cool. You can change colors and see what it looks like from either side, as if you were holding it. And then if you swap to the AR experience, you can look down and see the shoes on your feet, or on your friend's feet. You can also mix and match and show one shoe on your foot and one on your friend's foot; that works as well. And finally, you can take a picture and share it with your friends.

QnA

Snap SDK and Q&A

Short description:

The Snap SDK required writing a significant amount of native code, but it came with a fully working example app. The UI had to be customized, but the overall experience was smooth. The SDK works from Android API 24 and iOS 11 onwards. In this talk, I discussed Viro, Wanna, and Snap for developers. Shoutout to Formidable, the company I work for. Now, let's move on to the Q&A. The first question is about obtaining 3D models for shoes.

So in summary, with the Snap SDK, I will say that there was quite a lot of native code we had to write. The earlier slide about embracing native code was there for a reason; I definitely had to embrace native code for this project. But on the upside, they did provide us with a fully working example app. Obviously, the UI didn't look anything like ours; we had to change things quite a bit and build it from scratch, but it was a really nice help to get started. Also on the upside, it was a very, very smooth experience. You can tell this is an SDK that powers a really powerful app, the Snapchat app itself. They've had lots of engineers and years and years of practice making this as smooth and as nice as possible. So it was a very nice end experience with minimal effort on our part. And, very importantly for us, it works from Android API 24 and iOS 11 onwards.

So, in summary, the three main things I talked about were Viro, which is a React Native bridge for AR, if you just want to try out AR in React Native; and then, for virtual try-on in particular, Wanna and Snap for developers. If you're interested in either of these things, I would definitely check them out. And just a shoutout to Formidable, the company I work for, thanks to whom I am here, who are awesome. We do lots of open source; check us out at formidable.com. Thank you very much.

Oh, hello. Thank you so much, Kadi. Now we're going to do a brief Q&A. We're a little behind schedule, so we're going to keep this to a few minutes, and then if you have more questions for Kadi, you can find her at the speaker Q&A booth afterwards. So, the first question from the audience is: how did you get the 3D models for the shoes you're selling? That's a great question. There is actually another company called Vertebrae that we work with; they get the shoes shipped to them, create the 3D model of the shoe, and then work with Snap to get it integrated. So could anybody just go and get their shoes modeled, or is this an enterprise-deal type of situation? I think this particular one is an enterprise-deal type of situation, but I'm sure there are ways to do it as an individual as well.

Creating 3D Models and Q&A

Short description:

The easiest way to create a 3D model for AR Try-On is to use 3D modeling software. Sarah Viera will discuss modeling in Blender. Despite the challenging timeline, we found the best SDK and tool for the job. If you have more questions, please visit the speaker Q&A booth.

I think the easiest way to actually do it would be to create a 3D model in some kind of 3D modeling software, and use that rather than the real shoe. That's cool. And by the way, I think we do have a talk about that: Sarah Viera is talking about modeling in Blender later, so that might be a good one to catch.

Perfect. I have a question for you. This is not from Slido, but you were given a brief to do this within a timeline of weeks. That sounds scary and difficult. How did that actually go in the end? I think people tend to scope things based on the time frames they have. If we had been told we had two years to build this, we would probably have started from a much lower level and said, yes, we're going to build our own custom AR solution. But knowing that we had a couple of weeks, that option went out the door straight away. We needed to find the best SDK, the best tool, for the job.

Nice. Thank you. So now, if you do have more questions for Kadi, that's all we have in Slido, so feel free to go and talk to Kadi in the speaker Q&A booth. We're going to be moving on to our next talk. Thanks so much, Kadi.

Check out more articles and videos


Building Fun Experiments with WebXR & Babylon.js
JS GameDev Summit 2022
33 min
Building Fun Experiments with WebXR & Babylon.js
Top Content
This Talk explores the use of Babylon.js and WebXR to create immersive VR and AR experiences on the web. It showcases various demos, including transforming a 2D game into a 3D and VR experience, VR music composition, AR demos, and exploring a virtual museum. The speaker emphasizes the potential of web development in the metaverse and mentions the use of WebXR in Microsoft products. The limitations of WebXR on Safari iOS are discussed, along with the simplicity and features of Babylon.js. Contact information is provided for further inquiries.
Raising the Bar: Our Journey Making React Native a Preferred Choice
React Advanced 2023
29 min
Raising the Bar: Our Journey Making React Native a Preferred Choice
Watch video: Raising the Bar: Our Journey Making React Native a Preferred Choice
This Talk discusses React Native at Microsoft and the efforts to improve code integration, developer experience, and leadership goals. The goal is to extend React Native to any app, utilize web code, and increase developer velocity. Implementing web APIs for React Native is being explored, as well as collaboration with Meta. The ultimate aim is to make web code into universal code and enable developers to write code once and have it work on all platforms.
Opensource Documentation—Tales from React and React Native
React Finland 2021
27 min
Opensource Documentation—Tales from React and React Native
Documentation is often your community's first point of contact with your project and their daily companion at work. So why is documentation the last thing that gets done, and how can we do it better? This talk shares how important documentation is for React and React Native and how you can invest in or contribute to making your favourite project's docs to build a thriving community
Bringing React Server Components to React Native
React Day Berlin 2023
29 min
Bringing React Server Components to React Native
Top Content
Watch video: Bringing React Server Components to React Native
React Server Components (RSC) offer a more accessible approach within the React model, addressing challenges like big initial bundle size and unnecessary data over the network. RSC can benefit React Native development by adding a new server layer and enabling faster requests. They also allow for faster publishing of changes in mobile apps and can be integrated into federated super apps. However, implementing RSC in mobile apps requires careful consideration of offline-first apps, caching, and Apple's review process.
React Native Kotlin Multiplatform Toolkit
React Day Berlin 2022
26 min
React Native Kotlin Multiplatform Toolkit
Top Content
The Talk discusses the combination of React Native and Kotlin Multiplatform for cross-platform app development. Challenges with native modules in React Native are addressed, and the potential improvements of using Kotlin Multiplatform Mobile are explored. The integration of Kotlin Multiplatform with React Native streamlines native implementation and eliminates boilerplate code. Questions about architecture and compatibility, as well as the possibility of supporting React Native Web, are discussed. The React Native toolkit works with native animations and has potential for open-source development.
“Microfrontends” for Mobile in React Native
React Advanced 2023
24 min
“Microfrontends” for Mobile in React Native
Top Content
Watch video: “Microfrontends” for Mobile in React Native
Micro frontends are an architectural style where independent deliverable frontend applications compose a greater application. They allow for independent development and deployment, breaking down teams into feature verticals. React Native's architecture enables updating the JavaScript layer without going through the app store. Code Push can be used to deploy separate JavaScript bundles for each micro frontend. However, there are challenges with managing native code and dependencies in a micro frontend ecosystem for mobile apps.

Workshops on related topic

Introducing FlashList: Let's build a performant React Native list all together
React Advanced 2022
81 min
Introducing FlashList: Let's build a performant React Native list all together
Top Content
Featured Workshop
David Cortés Fulla
Marek Fořt
Talha Naqvi
3 authors
In this workshop you’ll learn why we created FlashList at Shopify and how you can use it in your code today. We will show you how to take a list that is not performant in FlatList and make it performant using FlashList with minimum effort. We will use tools like Flipper, our own benchmarking code, and teach you how the FlashList API can cover more complex use cases and still keep a top-notch performance.
You will know:
- Quick presentation about what FlashList is, why we built it, etc.
- Migrating from FlatList to FlashList
- Teaching how to write a performant list
- Utilizing the tools provided by the FlashList library (mainly the useBenchmark hook)
- Using the Flipper plugins (flame graph, our lists profiler, UI & JS FPS profiler, etc.)
- Optimizing performance of FlashList by using more advanced props like `getType`
- 5-6 sample tasks where we’ll uncover and fix issues together
- Q&A with the Shopify team
Detox 101: How to write stable end-to-end tests for your React Native application
React Summit 2022
117 min
Detox 101: How to write stable end-to-end tests for your React Native application
Top Content
Workshop
Yevheniia Hlovatska
Compared to unit testing, end-to-end testing aims to interact with your application just like a real user. And as we all know it can be pretty challenging. Especially when we talk about Mobile applications.
Tests rely on many conditions and are considered to be slow and flaky. On the other hand - end-to-end tests can give the greatest confidence that your app is working. And if done right - can become an amazing tool for boosting developer velocity.
Detox is a gray-box end-to-end testing framework for mobile apps. Developed by Wix to solve the problem of slowness and flakiness and used by React Native itself as its E2E testing tool.
Join me on this workshop to learn how to make your mobile end-to-end tests with Detox rock.
Prerequisites:
- iOS/Android: macOS Catalina or newer
- Android only: Linux
- Install before the workshop
How to Build an Interactive “Wheel of Fortune” Animation with React Native
React Summit Remote Edition 2021
60 min
How to Build an Interactive “Wheel of Fortune” Animation with React Native
Top Content
Workshop
Oli Bates
- Intro: Cleo & our mission
- What we want to build, how it fits into our product & purpose, run through designs
- Getting started with environment set up & “hello world”
- Intro to React Native Animation
- Step 1: Spinning the wheel on a button press
- Step 2: Dragging the wheel to give it velocity
- Step 3: Adding friction to the wheel to slow it down
- Step 4 (stretch): Adding haptics for an immersive feel
Deploying React Native Apps in the Cloud
React Summit 2023
88 min
Workshop (Free)
Cecelia Martinez
Deploying React Native apps manually on a local machine can be complex. The differences between Android and iOS require developers to use specific tools and processes for each platform, including hardware requirements for iOS. Manual deployments also make it difficult to manage signing credentials, environment configurations, track releases, and to collaborate as a team.
Appflow is the cloud mobile DevOps platform built by Ionic. Using a service like Appflow to build React Native apps not only provides access to powerful computing resources but also simplifies the deployment process by providing a centralized environment for managing and distributing your app to multiple platforms. This can save time and resources, enable collaboration, and improve the overall reliability and scalability of an app.
In this workshop, you’ll deploy a React Native application for delivery to Android and iOS test devices using Appflow. You’ll also learn the steps for publishing to Google Play and Apple App Stores. No previous experience with deploying native applications is required, and you’ll come away with a deeper understanding of the mobile deployment process and best practices for how to use a cloud mobile DevOps platform to ship quickly at scale.
Effective Detox Testing
React Advanced 2023
159 min
Workshop
Josh Justice
So you’ve gotten Detox set up to test your React Native application. Good work! But you aren’t done yet: there are still a lot of questions you need to answer. How many tests do you write? When and where do you run them? How do you ensure there is test data available? What do you do about parts of your app that use mobile APIs that are difficult to automate? You could sink a lot of effort into these things—is the payoff worth it?
In this three-hour workshop we’ll address these questions by discussing how to integrate Detox into your development workflow. You’ll walk away with the skills and information you need to make Detox testing a natural and productive part of day-to-day development.
Table of contents:
- Deciding what to test with Detox vs React Native Testing Library vs manual testing
- Setting up a fake API layer for testing
- Getting Detox running on CI on GitHub Actions for free
- Deciding how much of your app to test with Detox: a sliding scale
- Fitting Detox into your local development workflow
Prerequisites
- Familiarity with building applications with React Native
- Basic experience with Detox
- Machine setup: a working React Native CLI development environment including either Xcode or Android Studio
Building for Web & Mobile with Expo
React Day Berlin 2022
155 min
Workshop
Josh Justice
We know that React is for the web and React Native is for Android and iOS. But have you heard of react-native-web, which lets you write an app for Android, iOS, and the web in one codebase? Just like React Native abstracts away the details of iOS and Android, React Native Web abstracts away the details of the browser. This opens up the possibility of even more code sharing across platforms.
In this workshop you’ll walk through setting up the skeleton for a React Native Web app that works great and looks awesome. You can use the resulting codebase as a foundation to build whatever app you like on top of it, using the React paradigms and many JavaScript libraries you’re used to. You might be surprised how many types of app don’t really require a separate mobile and web codebase!
What's included
1. Setting up drawer and stack navigators with React Navigation, including responsiveness
2. Configuring React Navigation with URLs
3. Setting up React Native Paper including styling the React Navigation drawer and headers
4. Setting up a custom color theme that supports dark mode
5. Configuring favicons/app icons and metadata
6. What to do when you can’t or don’t want to provide the same functionality on web and mobile
Prerequisites
- Familiarity with building applications with either React or React Native. You do not need to know both.
- Machine setup: Node LTS, Yarn, and the ability to successfully create and run a new Expo app following the instructions on https://docs.expo.dev/get-started/create-a-new-app/
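The responsiveness mentioned in step 1 of "What's included" usually comes down to choosing a drawer style based on window width: a permanently visible drawer on wide web/tablet layouts and a slide-in drawer on phones. A minimal sketch of that decision (the 768px breakpoint is an assumption, not from the workshop; in the app itself you would read the width from React Native's `useWindowDimensions` and feed the result to React Navigation's `drawerType` option):

```javascript
// Pick a React Navigation drawerType from the window width.
// "permanent" keeps the drawer always visible (wide layouts);
// "front" slides it in over the content (narrow layouts).
function drawerTypeForWidth(width, breakpoint = 768) {
  return width >= breakpoint ? "permanent" : "front";
}

console.log(drawerTypeForWidth(1280)); // "permanent" (desktop browser)
console.log(drawerTypeForWidth(390));  // "front" (typical phone)
```

Because the same function runs on web and native, the one codebase adapts to both without platform-specific navigation code.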