
Augmented/Virtual Reality FAQ
Augmented Reality (AR) and Virtual Reality (VR) have moved from experimental demos to real products used in gaming, training, manufacturing, healthcare, education, and more. This FAQ answers the most common questions about AR/VR (often grouped under XR for “Extended Reality”) with practical, up-to-date context for developers, job seekers, and anyone exploring the space.
AR vs VR vs MR vs XR: what’s the difference?
- Virtual Reality (VR): Fully immerses you in a simulated environment and blocks most of the real world (typically with a headset).
- Augmented Reality (AR): Overlays digital content onto the real world (often via a phone/tablet camera, or AR glasses).
- Mixed Reality (MR): A subset of AR where virtual objects are more tightly “anchored” to the physical world and can respond to surfaces, depth, and occlusion.
- Extended Reality (XR): Umbrella term that includes AR, VR, and MR.
In practice, companies use these terms inconsistently. If a role says “XR developer,” it often means “AR and/or VR developer,” usually with Unity, Unreal, or native mobile AR.
What is “spatial computing” and is it different from XR?
Spatial computing is a broader concept: computing experiences that understand and use 3D space as the interface (hands, gaze, 3D windows, anchored content, etc.). Many spatial computing products are XR devices, but not all spatial computing is a headset.
If you’re building apps that place UI and objects in 3D space, handle anchors, track hands, or respond to real-world geometry, you’re working in the spatial computing domain.
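For a concrete flavor of “responding to real-world geometry,” here is a minimal sketch using the WebXR Device API’s hit-test module. It assumes a browser with AR support and WebXR type definitions (e.g. @types/webxr); the logging is a stand-in for whatever a renderer would actually do with the pose.

```typescript
// Minimal sketch: find where the viewer's ray hits real-world geometry,
// using the WebXR Hit Test Module. Assumes WebXR type definitions
// (e.g. @types/webxr) are installed.

async function startAnchoredAR(): Promise<void> {
  // Request an immersive AR session with hit testing enabled.
  const session = await navigator.xr!.requestSession("immersive-ar", {
    requiredFeatures: ["hit-test", "local"],
  });

  const localSpace = await session.requestReferenceSpace("local");
  const viewerSpace = await session.requestReferenceSpace("viewer");

  // A hit-test source casts a ray from the viewer into the scene.
  const hitTestSource = await session.requestHitTestSource!({ space: viewerSpace });

  session.requestAnimationFrame(function onFrame(_time, frame) {
    const hits = frame.getHitTestResults(hitTestSource);
    if (hits.length > 0) {
      // The pose is where the ray intersected real-world geometry;
      // a renderer would place or "anchor" content at this transform.
      const pose = hits[0].getPose(localSpace);
      if (pose) {
        console.log("Surface hit at", pose.transform.position);
      }
    }
    frame.session.requestAnimationFrame(onFrame);
  });
}
```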
What are the most common use cases for AR and VR?
Here are the areas driving real adoption today:
- Training and simulation: Safety training, medical procedures, equipment operation.
- Design and visualization: Architecture, product design reviews, digital twins.
- Remote assistance: Step-by-step guidance overlaid on real equipment.
- Education: Immersive labs, guided 3D learning, virtual field trips.
- Gaming and entertainment: Still a major driver for VR.
- Retail and marketing: Try-on experiences, product visualization.
VR often wins when full immersion matters (training, simulation, gaming). AR often wins when you need to stay grounded in the real environment (work instructions, navigation, “try before you buy”).
What skills do I need to get into AR/VR development?
AR/VR development is interdisciplinary. The skill mix depends on what you build, but the most consistently valuable foundations are:
- A real-time engine: Unity or Unreal Engine.
- 3D math basics: Vectors, transforms, quaternions, projection (see the quaternion sketch after this list).
- Performance thinking: Frame time budgets, draw calls, GPU/CPU constraints.
- Interaction design: Comfortable UX in 3D (motion sickness, reach, scale).
- Debugging on device: Headsets and AR phones behave differently from desktop builds.
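To make the 3D math bullet concrete, here is a small dependency-free sketch of the single most common operation: rotating a vector by a quaternion. Engines provide this for you (Unity’s Quaternion * Vector3, Unreal’s FQuat::RotateVector); this just shows the math underneath, written in TypeScript for illustration.

```typescript
// Dependency-free 3D math sketch: rotate a vector by a unit quaternion.
// v' = v + 2w(u × v) + 2(u × (u × v)), where u is the quaternion's xyz part.

type Vec3 = { x: number; y: number; z: number };
type Quat = { x: number; y: number; z: number; w: number }; // unit quaternion

function cross(a: Vec3, b: Vec3): Vec3 {
  return {
    x: a.y * b.z - a.z * b.y,
    y: a.z * b.x - a.x * b.z,
    z: a.x * b.y - a.y * b.x,
  };
}

function rotate(q: Quat, v: Vec3): Vec3 {
  const u = { x: q.x, y: q.y, z: q.z };
  const t = cross(u, v);
  const t2 = { x: 2 * t.x, y: 2 * t.y, z: 2 * t.z };
  const c = cross(u, t2);
  return {
    x: v.x + q.w * t2.x + c.x,
    y: v.y + q.w * t2.y + c.y,
    z: v.z + q.w * t2.z + c.z,
  };
}

// A 90° rotation about the Y (up) axis maps (0, 0, 1) to (1, 0, 0).
const halfAngle = Math.PI / 4; // quaternions encode half the rotation angle
const yaw90: Quat = { x: 0, y: Math.sin(halfAngle), z: 0, w: Math.cos(halfAngle) };
console.log(rotate(yaw90, { x: 0, y: 0, z: 1 })); // ≈ { x: 1, y: 0, z: 0 }
```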
If you want a step-by-step path, start with the comprehensive guide on becoming an AR/VR developer.
What programming languages are used most in AR/VR?
The “default” languages depend on the platform and engine:
- C# for Unity.
- C++ for Unreal Engine (plus Blueprints).
- Swift (iOS) and Kotlin/Java (Android) for native mobile AR.
- JavaScript/TypeScript for WebXR.
- Python for tooling, backend services, and AI/computer vision pipelines.
If you’re choosing where to start, this guide breaks it down in depth: What programming languages are needed for AR and VR development?
Should I learn Unity or Unreal for AR/VR?
Both can lead to paid work. The best choice depends on your target industry and the kind of experiences you want to build:
- Unity: Extremely common across mobile AR, enterprise XR, and many VR titles. Large plugin ecosystem and faster iteration for many teams.
- Unreal Engine: Strong choice for high-end visuals, cinematic content, and teams that want maximum rendering control.
If you’re early in your journey and optimizing for employability, Unity is often the fastest entry point. If you’re targeting premium visuals or specific studios that use Unreal, learn Unreal.
Is WebXR a good path, or is it too limited?
WebXR is a strong path if you care about distribution and rapid iteration (a URL is easier to ship than an app install). It’s great for:
- Lightweight AR product demos
- Education experiences
- Marketing activations
Limitations are real: device support varies, performance headroom is tighter than native, and platform capabilities may lag behind native SDKs. But WebXR is improving steadily and is a useful skill, especially if you already know web development.
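As a taste of the API, here is a minimal sketch of the feature detection most WebXR apps run before offering an “Enter VR” button. It assumes WebXR type definitions (e.g. @types/webxr) and a button with the illustrative id enter-vr; “local-floor” and “hand-tracking” are optional feature descriptors the runtime may or may not grant.

```typescript
// Minimal WebXR feature detection: check what the current browser/device
// supports before showing an "Enter VR" or "Start AR" button.

async function detectXRSupport(): Promise<void> {
  if (!("xr" in navigator) || !navigator.xr) {
    console.log("WebXR not available; fall back to a 2D/3D page experience.");
    return;
  }

  const vrSupported = await navigator.xr.isSessionSupported("immersive-vr");
  const arSupported = await navigator.xr.isSessionSupported("immersive-ar");
  console.log({ vrSupported, arSupported });

  if (vrSupported) {
    // Immersive session requests must come from a user gesture (e.g. a click).
    document.querySelector("button#enter-vr")?.addEventListener("click", async () => {
      const session = await navigator.xr!.requestSession("immersive-vr", {
        optionalFeatures: ["local-floor", "hand-tracking"],
      });
      session.addEventListener("end", () => console.log("Session ended."));
    });
  }
}
```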
What hardware should I buy to learn AR/VR development?
You don’t need every headset. Pick based on what you want to build:
- For mobile AR: You can start with a recent iPhone/iPad (ARKit) or Android device that supports ARCore.
- For VR development: A standalone headset is often the easiest entry point for building and testing on device.
- For MR/spatial computing: Choose based on the ecosystem you want to target.
If you’re brand new and budget-conscious, start with mobile AR (lowest barrier) or WebXR (no app store friction), then expand once you have a few projects.
Why does AR/VR development feel harder than “normal” app development?
Because you’re juggling more constraints at once:
- Real-time performance: Dropped frames are not just “laggy”; they can be physically uncomfortable (see the frame-budget sketch after this list).
- 3D complexity: Coordinate spaces, transforms, spatial anchors, and device tracking.
- Device variability: Different tracking quality, input, display refresh rates, and runtime quirks.
- UX risk: What looks fine in 2D can feel confusing or physically tiring in 3D.
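To put numbers on “frame time budgets”: everything your app does each frame (simulation, physics, rendering, input) has to fit inside one refresh interval, as the arithmetic below shows.

```typescript
// Illustrative frame-budget arithmetic: at a given refresh rate, all per-frame
// work must fit inside one frame's time slice.

function frameBudgetMs(refreshHz: number): number {
  return 1000 / refreshHz;
}

console.log(frameBudgetMs(60).toFixed(1));  // "16.7" — common phone/AR target
console.log(frameBudgetMs(90).toFixed(1));  // "11.1" — common VR headset target
console.log(frameBudgetMs(120).toFixed(1)); // "8.3"  — higher-refresh headsets

// A dropped frame on a 90 Hz headset means the displayed image is stale for
// at least another 11.1 ms, which users can feel as judder or discomfort.
```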
The upside is that these constraints are exactly why AR/VR skills can be valuable in the job market.
What is 6DoF and why does it matter?
6DoF means “six degrees of freedom”: position (x, y, z) plus rotation (pitch, yaw, roll). 6DoF tracking enables natural movement and presence.
- 3DoF (older/limited): rotation only.
- 6DoF (modern): rotation + real movement in space.
Most modern VR and MR experiences assume 6DoF and are designed around it.
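In code, a 6DoF pose is usually nothing more exotic than a position plus an orientation quaternion. Here is a minimal sketch; the types and the onHeadPose callback are illustrative, not any particular SDK’s API.

```typescript
// A 6DoF pose: where something is (position) and which way it faces (orientation).
// 3DoF devices could only report the orientation half.

interface Pose6DoF {
  position: { x: number; y: number; z: number }; // meters, in tracking space
  orientation: { x: number; y: number; z: number; w: number }; // unit quaternion
}

// Hypothetical per-frame update: each tracked thing (head, hands, controllers)
// delivers a fresh pose, typically 60-120 times per second.
function onHeadPose(pose: Pose6DoF): void {
  // A 3DoF renderer would use only pose.orientation; with 6DoF you can also
  // let the user physically lean, crouch, and walk through the scene.
  console.log("head at", pose.position, "facing", pose.orientation);
}
```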
What causes motion sickness in VR and how do you reduce it?
VR discomfort often happens when the eyes see motion that the body doesn’t feel (sensory mismatch). Common triggers include low frame rate, latency, and artificial locomotion.
Ways to reduce it:
- Hit stable frame rates and minimize latency.
- Prefer teleport or comfort locomotion early on.
- Avoid forced camera motion (especially rotation).
- Keep acceleration gentle and offer comfort options.
Comfort is UX, engineering, and content design combined.
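As one concrete comfort technique: snap turning rotates the player in discrete steps instead of smoothly, removing the continuous visual rotation that many people find nauseating. An engine-agnostic sketch follows; names like playerYawDegrees are illustrative.

```typescript
// Snap turning: rotate the player in discrete steps rather than smoothly.
// Engine-agnostic sketch; a real app would apply the yaw to the player rig.

const SNAP_ANGLE_DEGREES = 30; // typical comfort values: 30-45°
const STICK_DEADZONE = 0.7;    // require a deliberate flick of the thumbstick

let playerYawDegrees = 0;
let stickReturnedToCenter = true; // one snap per flick, not one per frame

function updateSnapTurn(stickX: number): void {
  if (Math.abs(stickX) < STICK_DEADZONE) {
    stickReturnedToCenter = true; // stick released; allow the next snap
    return;
  }
  if (!stickReturnedToCenter) return; // still held from the previous snap

  playerYawDegrees += Math.sign(stickX) * SNAP_ANGLE_DEGREES;
  stickReturnedToCenter = false;
  // Some apps also add a brief fade or blink to mask the instantaneous turn.
}
```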
What are the best AR/VR projects to build for a portfolio?
Portfolio projects that stand out usually demonstrate real interaction, not just visuals:
- A VR interaction sandbox: grabbing, throwing, UI, object snapping.
- An AR measurement or annotation tool: anchors, planes, persistence.
- A multi-user experience: networked avatars or shared anchors.
- A performance-optimized scene: profiling evidence and measurable improvements.
Keep projects small enough to finish, and document what you learned, what you optimized, and what tradeoffs you made.
What’s the difference between computer vision and AR?
Computer vision is the broader field of understanding images/video (object detection, tracking, segmentation). AR often relies on computer vision (especially for tracking and understanding the environment), but AR also includes rendering, UX, and real-time interaction.
If you like AI and perception, AR can be a great place to apply computer vision work.
Are AR/VR developer jobs actually in demand?
Demand is cyclical (like the rest of tech), but XR continues to grow in enterprise and specialized consumer segments. Roles appear under many titles:
- XR Developer / AR Developer / VR Developer
- Unity Developer / Unreal Developer
- Spatial Computing Engineer
- 3D Interaction Engineer
If you’re job hunting, don’t only search “VR.” Search for the engine/platform keywords as well.
What do AR/VR developers typically earn?
Compensation varies widely by geography, seniority, engine specialization, and company stage. For a deeper salary breakdown and factors that influence pay, see: Augmented/Virtual Reality Developer Salary in 2025.
What industries hire AR/VR developers besides gaming?
Many of the most stable roles are outside gaming:
- Manufacturing and field service
- Healthcare
- Defense and aerospace
- Automotive
- Architecture/engineering/construction (AEC)
- Education and training
In these sectors, “XR” is often a productivity tool rather than entertainment.
What’s the future of AR/VR?
The long-term direction is toward:
- Smaller, lighter devices with better displays and passthrough.
- Better input (hands, eye tracking, voice, controllers).
- More capable spatial mapping and scene understanding.
- More web and cross-platform distribution via standards.
If you want historical context for where the tech has come from (and how quickly it evolves), this overview helps: The complete AR/VR technology timeline.
Conclusion
AR and VR are not “one thing,” and that’s why the ecosystem can feel confusing at first. But the fundamentals—real-time 3D, performance discipline, and human-centered interaction design—carry across platforms. If you’re learning, pick one path (Unity VR, mobile AR, WebXR) and ship small projects consistently. Momentum matters more than perfect tooling choices.
References
1. Unity Technologies. Unity Real-Time Development Platform.
2. Epic Games. Unreal Engine.
3. Apple Developer. ARKit.
4. Google Developers. ARCore.
5. W3C Immersive Web Working Group. WebXR Device API.