You’ve probably already tried the obvious version of this task. Install a player, paste a video URL, render <Video />, and expect the feature to be done by lunch. Then Android refuses to cooperate, iOS autoplays only when muted, fullscreen behaves differently on real devices, and your FlatList turns into a battery drain the moment multiple videos mount at once.
That’s why a good react native video example can’t stop at “it plays on the simulator.” A production player has to survive native configuration, flaky mobile networks, layout edge cases, and real user behavior like scrubbing, backgrounding, and scrolling through feeds. The JavaScript layer is only part of the story.
Starting Your React Native Video Journey
A lot of teams hit the same sequence of problems.
First, the basic player works in one environment and fails in another. Then someone adds custom controls, and now progress updates stutter or seeking becomes unreliable. Then product asks for autoplay in a feed, live streaming, or a looping hero video on login, and the original quick implementation starts fighting the platform.
The good news is that React Native has a mature path for this. According to ImageKit’s react-native-video overview, the react-native-video library was first released in 2016, now sees over 6.5 million weekly npm downloads, and appears in an estimated 70% of React Native projects involving media, with standard examples built around props such as source={{ uri: '...' }}, controls={true}, and resizeMode='contain' for reliable rendering.
That level of adoption matters. It means you’re not stitching together a niche player library with shallow docs and abandoned native code.
Practical rule: If video is even moderately important to your app, test on a real Android device early. Simulators hide too many playback, buffering, and decoder issues.
The react native video example in this guide starts simple, but it won’t stay simplistic. It covers the parts most tutorials skip: Android native setup, iOS behavior that surprises web-focused teams, list performance, HLS playback hooks, and debugging patterns that prove useful when the app ships outside your office Wi-Fi.
Choosing Your Library: react-native-video vs expo-av
Before writing player UI, pick the right foundation. For most React Native teams, this decision comes down to react-native-video or expo-av.

They overlap in the obvious ways. Both can play video. Both can fit straightforward app requirements. A key difference emerges when your app needs tighter native control, more custom UI, or behavior that has to feel predictable across iOS and Android.
What react-native-video does better
react-native-video is the stronger choice when video is a core feature instead of a decorative add-on.
It gives you direct access to the player through props and refs, works well with custom controls, and fits bare React Native projects naturally. If your roadmap includes HLS streams, feed autoplay logic, subtitles, analytics integration, or platform-specific tuning, this library gives you more room to work.
Its trade-off is setup friction. You have native dependencies, platform-specific quirks, and more responsibility for your own UI.
Where expo-av makes sense
If your team is deep in Expo and wants fast setup with fewer native concerns, expo-av is easier to live with. That’s especially true for prototypes, internal tools, lightweight media screens, or apps where video isn’t central to retention.
The trade-off is abstraction. Once your requirements get more demanding, that abstraction can start to feel restrictive. You can read a broader Expo-oriented setup perspective in this React Native Expo tutorial.
Quick comparison
| Criteria | react-native-video | expo-av |
|---|---|---|
| Setup | More native work | Smoother in Expo projects |
| Customization | Strong, especially for custom controls and native behavior | Good for common cases, less flexible for advanced needs |
| Best fit | Media-heavy apps, streaming, custom UX | Rapid prototyping, simpler product requirements |
| Maintenance style | More hands-on | More framework-guided |
Use expo-av when convenience is the main constraint. Use react-native-video when playback behavior is a product feature.
For the rest of this guide, react-native-video is the better teaching vehicle because it exposes the parts you need to understand when your player moves from demo to production.
Building the Core Video Player Component
The first version should be boring. That’s a compliment.
A stable player starts with a plain component that renders correctly, respects aspect ratio, and exposes enough hooks for later customization.

Install the library cleanly
For a bare React Native app:
Install the package:

```
yarn add react-native-video
```

On iOS, install native pods:

```
cd ios && pod install
```

Then rebuild the app. Don’t trust hot reload after native dependency changes.
If you’re on Android and something looks off after install, do a clean rebuild instead of chasing ghosts in Metro. Video libraries touch native playback layers. Old build artifacts cause confusing failures.
A minimal player that works
This is the version I’d start with before touching custom controls:
import React from 'react';
import { StyleSheet, View } from 'react-native';
import Video from 'react-native-video';
export default function BasicVideoPlayer() {
return (
<View style={styles.container}>
<Video
source={{ uri: 'https://example.mp4' }}
style={styles.video}
controls={true}
resizeMode="contain"
/>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
justifyContent: 'center',
alignItems: 'center',
backgroundColor: '#111',
},
video: {
width: 300,
height: 300,
backgroundColor: '#000',
},
});
That shape is consistent with the standard example pattern documented in the earlier cited library overview: source, controls, and resizeMode='contain'.
Why these props matter
source looks trivial, but it drives a lot of debugging. Start with a known-good MP4 before trying signed URLs, manifests, or authenticated endpoints. If playback fails on a simple file, your issue is usually native setup or device compatibility, not app logic.
controls={true} is useful at first because it removes your custom UI from the debugging path. If users can play and scrub with built-in controls, your playback layer is healthy.
resizeMode="contain" is the safer default for debugging and mixed aspect ratios. It avoids unexpected cropping and makes it obvious whether the video is rendering at all.
Add a ref early
Even if you’re not seeking yet, wire in a ref now. You’ll need it for scrubbing, replay, and fullscreen actions.
import React, { useRef } from 'react';
import { StyleSheet, View, Button } from 'react-native';
import Video from 'react-native-video';
export default function VideoWithRef() {
const videoRef = useRef<Video | null>(null);
return (
<View style={styles.container}>
<Video
ref={videoRef}
source={{ uri: 'https://example.mp4' }}
style={styles.video}
controls={true}
resizeMode="contain"
/>
<Button
title="Jump to 30s"
onPress={() => videoRef.current?.seek(30)}
/>
</View>
);
}
const styles = StyleSheet.create({
container: {
flex: 1,
justifyContent: 'center',
padding: 16,
backgroundColor: '#111',
},
video: {
width: '100%',
height: 240,
backgroundColor: '#000',
},
});
That small decision saves refactoring later. It also gives you a clean path toward a more realistic react native video example where the player is part of a larger screen, not an isolated demo.
Crafting Custom Player Controls for a Better UX
The built-in controls prop is fine for proving playback. It’s not how polished apps ship.
Native controls often clash with app branding, behave differently across platforms, and make it harder to add product-specific actions like skip intros, replay buttons, or analytics events tied to user interaction. A custom overlay gives you consistency.

Start with state you actually need
You don’t need a giant reducer to build useful controls. Most players need these pieces of state:
- `paused` for play and pause
- `duration` from `onLoad`
- `currentTime` from `onProgress`
- `isBuffering` from `onBuffer`
- A `ref` for seeking and fullscreen actions
Here’s a working custom player:
import React, { useRef, useState } from 'react';
import {
ActivityIndicator,
Pressable,
StyleSheet,
Text,
View,
} from 'react-native';
import Slider from '@react-native-community/slider';
import Video from 'react-native-video';
export default function CustomPlayer() {
const videoRef = useRef<Video | null>(null);
const [paused, setPaused] = useState(false);
const [duration, setDuration] = useState(0);
const [currentTime, setCurrentTime] = useState(0);
const [isBuffering, setIsBuffering] = useState(false);
const formatTime = (time: number) => {
const mins = Math.floor(time / 60);
const secs = Math.floor(time % 60);
return `${mins}:${secs < 10 ? `0${secs}` : secs}`;
};
return (
<View style={styles.wrapper}>
<Video
ref={videoRef}
source={{ uri: 'https://example.mp4' }}
style={styles.video}
paused={paused}
resizeMode="contain"
controls={false}
onLoad={({ duration }) => setDuration(duration)}
onProgress={({ currentTime }) => setCurrentTime(currentTime)}
onBuffer={({ isBuffering }) => setIsBuffering(isBuffering)}
/>
{isBuffering && (
<View style={styles.bufferOverlay}>
<ActivityIndicator size="large" color="#fff" />
</View>
)}
<View style={styles.controls}>
<Pressable onPress={() => setPaused(!paused)} style={styles.button}>
<Text style={styles.buttonText}>{paused ? 'Play' : 'Pause'}</Text>
</Pressable>
<Slider
style={styles.slider}
minimumValue={0}
maximumValue={duration || 1}
value={currentTime}
minimumTrackTintColor="#fff"
maximumTrackTintColor="#666"
thumbTintColor="#fff"
onSlidingComplete={(value) => videoRef.current?.seek(value)}
/>
<Text style={styles.time}>
{formatTime(currentTime)} / {formatTime(duration)}
</Text>
<Pressable
onPress={() => videoRef.current?.presentFullscreenPlayer?.()}
style={styles.button}
>
<Text style={styles.buttonText}>Fullscreen</Text>
</Pressable>
</View>
</View>
);
}
const styles = StyleSheet.create({
wrapper: {
backgroundColor: '#000',
borderRadius: 12,
overflow: 'hidden',
},
video: {
width: '100%',
height: 220,
backgroundColor: '#000',
},
bufferOverlay: {
...StyleSheet.absoluteFillObject,
justifyContent: 'center',
alignItems: 'center',
},
controls: {
padding: 12,
backgroundColor: '#111',
},
button: {
paddingVertical: 8,
},
buttonText: {
color: '#fff',
fontWeight: '600',
},
slider: {
width: '100%',
height: 40,
},
time: {
color: '#ddd',
marginBottom: 8,
},
});
What works better than default controls
The biggest UX improvement isn’t visual polish. It’s control over state transitions.
When buffering starts, you can show a spinner. When users scrub, you can decide whether to update the UI optimistically or wait for playback progress to catch up. When playback ends, you can show replay, advance to the next item, or keep the last frame visible.
Don’t update too much UI on every progress tick. If the whole screen rerenders on each callback, the player may stay alive while the rest of the interface gets janky.
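One way to keep progress ticks cheap is to throttle the state updates they trigger. Here is a minimal sketch; the 250 ms interval and the `setCurrentTime` setter in the usage comment are assumptions for illustration, not library defaults:

```javascript
// Throttle helper: invokes fn at most once per intervalMs.
// Dropped intermediate calls are fine for progress ticks,
// because only the latest playback time matters.
function throttle(fn, intervalMs) {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= intervalMs) {
      last = now;
      fn(...args);
    }
  };
}

// Usage sketch inside the player component:
// const onProgress = useMemo(
//   () => throttle(({ currentTime }) => setCurrentTime(currentTime), 250),
//   []
// );
// <Video onProgress={onProgress} ... />
```

Keeping the throttled callback stable (for example via useMemo or useRef) also avoids re-subscribing the native event on every render.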
Background and autoplay behavior
Custom controls matter even more for looping or decorative video. For login or onboarding screens, the useful baseline is paused={false}, repeat={true}, and muted={true}. Muting by default helps you comply with iOS autoplay restrictions, and LogRocket’s guide on adding videos in React Native notes that this approach has been shown to boost initial engagement by 40% in U.S.-based apps.
A simple background variant looks like this:
<Video
source={require('../assets/background.mp4')}
style={StyleSheet.absoluteFill}
paused={false}
repeat={true}
muted={true}
resizeMode="cover"
/>
That’s one of those areas where “works” and “works well” are different. If the video is decorative, don’t ship audio. If it’s interactive, expose explicit play intent.
Mastering Native Configuration and Permissions
Most playback bugs that feel random aren’t random. They come from native configuration that never got finished.
JavaScript can’t paper over a mismatched Android media stack or an iOS project that didn’t install native dependencies correctly.

Android setup that avoids common pain
On Android, be explicit about your player stack. A typical HLS live stream setup requires installing react-native-video, running pod install on iOS, and making sure Android uses a compatible ExoPlayer version. Callbacks like onBuffer and onProgress are also central to a responsive UI, and LiveAPI’s react-native-video example for HLS notes that adaptive bitrate streaming can reduce buffering by up to 80% on mobile networks.
A practical Android checklist:
- Pin compatibility: keep your app’s Gradle and media dependencies aligned with the version expected by your installed `react-native-video`.
- Test HLS on device: emulator playback can mislead you. Real hardware exposes decoder and network behavior faster.
- Watch layout settings: if scaling looks wrong on low-end devices, start by checking `resizeMode`.
If your team needs more platform grounding, this React Native for iOS guide is useful context for how native configuration affects cross-platform behavior overall.
iOS setup that people skip
For iOS, pod install isn’t optional after adding native libraries. If a teammate pulls your branch and runs only Metro, they may think the player is broken when the native module just never linked.
In Info.plist, add only the permissions and capabilities your app needs. If the player fetches remote media, make sure the app can access those URLs under your network policy. If your app needs audio behavior beyond silent inline playback, configure that intentionally instead of assuming the default session will match product expectations.
Native video setup is where “it compiles on my machine” falls apart fastest. Treat iOS and Android as separate verification targets.
HLS-specific gotchas
For HLS streams, use a .m3u8 source and verify your manifest serves multiple bitrates. The player can only adapt if the stream was prepared for adaptation in the first place.
<Video
source={{ uri: 'https://example.com/stream.m3u8' }}
style={{ width: '100%', height: 240 }}
onBuffer={({ isBuffering }) => console.log('buffering', isBuffering)}
onProgress={({ currentTime }) => console.log('time', currentTime)}
muted={true}
/>
Muted autoplay is often the right first test on iOS. It removes one variable while you validate that stream delivery and rendering are working.
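The "multiple bitrates" check doesn’t require a device at all: a multi-variant HLS master playlist contains one `#EXT-X-STREAM-INF` line per rendition. A small sketch that counts them (the sample manifest text is illustrative, not a real stream):

```javascript
// Counts variant streams declared in an HLS master playlist.
// Adaptive playback needs more than one variant to switch between.
function countHlsVariants(manifestText) {
  return manifestText
    .split('\n')
    .filter((line) => line.trim().startsWith('#EXT-X-STREAM-INF')).length;
}

const sampleMaster = [
  '#EXTM3U',
  '#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360',
  'low/index.m3u8',
  '#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720',
  'high/index.m3u8',
].join('\n');

console.log(countHlsVariants(sampleMaster)); // → 2, so the stream can adapt
```

If this returns 1 for your production manifest, no amount of client-side tuning will give you adaptive bitrate behavior.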
Advanced Video Features and Performance Tuning
The hardest production bugs rarely come from a single player on a single screen. They come from multiple players, long sessions, weak devices, and unstable networks.
That’s also where most tutorials stop helping. The react-native-video repository itself makes the gap clear: its documentation covers basic playback and streaming features well, but offers much less practical guidance on memory profiling, caching strategy, and handling multiple video instances efficiently.
Feed playback needs visibility logic
If you put videos inside a FlatList, don’t let every mounted row behave like the active row. Off-screen playback wastes battery, keeps decoders busy, and creates weird audio edge cases.
The practical pattern is:
- Track which item is visible
- Pause everything else
- Keep only the active player unpaused
- Avoid rerendering the entire list on every progress event
If you want a broader mobile tuning checklist beyond video-specific concerns, this React Native performance guide complements this part of the stack well.
A simple list approach:
const renderItem = ({ item }) => (
<Video
source={{ uri: item.videoUrl }}
style={{ width: '100%', height: 220 }}
paused={activeId !== item.id}
resizeMode="cover"
muted={activeId !== item.id}
/>
);
Buffering and network resilience belong in the UI
onBuffer isn’t just a debugging hook. It’s part of perceived performance.
If the stream stalls and the user sees nothing, they assume the app is broken. If the same stall shows a clean loading state and preserves controls, the experience feels recoverable. That distinction matters more on mobile networks than on office Wi-Fi.
A buffering spinner is not enough by itself. Users also need context, especially when recovery takes longer than a brief stall.
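One way to encode that escalation is a pure mapping from stall duration to UI state. The thresholds below are product decisions, not library values, and the state names are hypothetical:

```javascript
// Maps how long playback has been stalled to a UI state.
// Record a timestamp when onBuffer reports isBuffering: true,
// clear it when playback resumes, and feed the elapsed ms here.
function stallUiState(stalledMs) {
  if (stalledMs === 0) return 'playing';        // no stall: normal UI
  if (stalledMs < 3000) return 'spinner';       // brief stall: spinner only
  if (stalledMs < 10000) return 'slow-network'; // longer: add a message
  return 'retry-prompt';                        // very long: offer a retry
}
```

Because the function is pure, it is trivial to unit test and keeps the escalation policy out of your render code.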
Streaming, subtitles, and protected content
For variable networks, prefer HLS or DASH over a single fixed MP4 when the product justifies streaming complexity. Adaptive delivery handles fluctuating bandwidth better than a one-size-fits-all asset.
Subtitles and captions also belong in the architecture conversation early, not as an afterthought. They affect controls, track selection, and layout. DRM is even more invasive. If protected content is on your roadmap, make that a library and platform evaluation item from day one because it changes native setup, testing, and release risk.
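As a sketch of how side-loaded subtitles tend to look with react-native-video, here is the general prop shape; the URLs are placeholders, and exact field names vary by library version, so verify against the docs for the version you install:

```tsx
<Video
  source={{ uri: 'https://example.com/stream.m3u8' }}
  style={{ width: '100%', height: 240 }}
  textTracks={[
    {
      title: 'English',
      language: 'en',
      type: 'text/vtt', // TextTrackType.VTT in the library's typings
      uri: 'https://example.com/subtitles/en.vtt',
    },
  ]}
  selectedTextTrack={{ type: 'language', value: 'en' }}
/>
```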
Troubleshooting Common React Native Video Bugs
When video fails, symptoms are often misleading. Audio plays but the screen is black. iOS works, Android doesn’t. Fullscreen opens and immediately exits. Those aren’t “React Native is flaky” problems. They usually point to one layer that needs attention.
Video plays on iOS but not Android
Check the Android native setup first. In practice, this often comes back to media stack compatibility, stream format handling, or a layout issue hiding the surface. Start with a simple MP4 and remove custom controls until baseline playback works.
Black screen with audio
This usually means rendering is failing, not playback. Try resizeMode="contain" first, verify the parent has real dimensions, and give the video a solid background color so you can distinguish layout bugs from decode bugs.
<Video
source={{ uri: 'https://example.mp4' }}
style={{ width: '100%', height: 240, backgroundColor: '#000' }}
resizeMode="contain"
/>
Fullscreen doesn’t behave reliably
Use a ref and call the fullscreen method from an explicit user action. Don’t mix default controls and custom fullscreen triggers unless you’ve tested both platforms thoroughly.
Network failures feel abrupt
Production apps need error boundaries and fallback UI when streams fail or connectivity drops. That gap is common in happy-path tutorials, but Stream’s fallback UI cookbook for React Native video scenarios makes the core point well: graceful degradation matters when video sources fail on diverse networks.
A minimal fallback pattern:
{hasError ? (
<View style={{ height: 240, justifyContent: 'center', alignItems: 'center', backgroundColor: '#222' }}>
<Text style={{ color: '#fff' }}>Video unavailable</Text>
</View>
) : (
<Video
source={{ uri: videoUrl }}
onError={() => setHasError(true)}
style={{ width: '100%', height: 240 }}
/>
)}
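Pair the fallback with a bounded retry policy rather than hammering a failing source. A sketch of an exponential backoff schedule; the base delay, cap, and attempt limit are all assumptions to tune:

```javascript
// Exponential backoff delay (ms) for retrying a failed video source.
// Returns null once attempts are exhausted, signalling "keep the
// fallback UI and stop retrying".
function retryDelayMs(attempt, baseMs = 1000, maxMs = 30000, maxAttempts = 5) {
  if (attempt >= maxAttempts) return null;
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

console.log([0, 1, 2, 3, 4, 5].map((n) => retryDelayMs(n)));
// → [ 1000, 2000, 4000, 8000, 16000, null ]
```

In the component, a retry would schedule a state reset (clearing hasError and remounting the Video, for example by changing its key) after the returned delay.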
If you’re building React Native apps where media quality, debugging speed, and platform-specific behavior matter, React Native Coders is worth keeping in your regular reading stack. It’s a practical resource for shipping better iOS and Android apps without learning the same hard lessons twice.