
Augmented Reality (AR): WebAR, Mobile AR Frameworks

Introduction

Augmented reality overlays digital content onto the physical world. This article covers AR development frameworks, WebAR, mobile AR, and enterprise applications.

Key Statistics:

  • AR market: roughly $30B (2025), with some projections near $300B by 2030
  • WebAR reach: 1B+ users via browser-based experiences, no app install required
  • Enterprise AR: an estimated 60% of AR revenue
  • Apple Vision Pro: spatial computing and 3D apps reaching mainstream hardware

AR Technology Overview

┌──────────────────────────────────────────────────────────────┐
│              Augmented Reality Technology Stack              │
├──────────────────────────────────────────────────────────────┤
│                                                              │
│  AR Hardware                                                 │
│  ├── Mobile (smartphones, tablets)                           │
│  ├── Smart Glasses (HoloLens, Magic Leap)                    │
│  ├── Spatial Computing (Apple Vision Pro)                    │
│  └── Industrial AR (RealWear, Vuzix)                         │
│                                                              │
│  Tracking Technologies                                       │
│  ├── Marker-based (QR, fiducial markers)                     │
│  ├── Markerless (SLAM, surface detection)                    │
│  ├── GPS-based (location AR)                                 │
│  └── Face/Body tracking                                      │
│                                                              │
│  Development Platforms                                       │
│  ├── WebAR: 8th Wall, AR.js, WebXR                           │
│  ├── Mobile: ARKit (iOS), ARCore (Android)                   │
│  ├── Cross-platform: Unity, Unreal, Flutter                  │
│  └── Industrial: PTC Vuforia, Microsoft Dynamics 365 Guides  │
│                                                              │
│  Key Capabilities                                            │
│  ├── Plane detection                                         │
│  ├── Light estimation                                        │
│  ├── Occlusion (depth)                                       │
│  ├── Spatial mapping                                         │
│  └── Image/object recognition                                │
│                                                              │
└──────────────────────────────────────────────────────────────┘

WebAR Implementation

AR.js WebAR

<!DOCTYPE html>
<html>
<head>
  <title>WebAR with AR.js</title>
  <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  <script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
  <style>
    body { margin: 0; overflow: hidden; }
    .arjs-video { opacity: 0.9; }
  </style>
</head>
<body>
  
  <!-- Marker-based AR -->
  <a-scene embedded arjs="sourceType: webcam; debugUIEnabled: false;">
    
    <!-- Lighting -->
    <a-entity light="type: ambient; color: #BBB"></a-entity>
    <a-entity light="type: directional; color: #FFF; intensity: 0.6" position="-0.5 1 1"></a-entity>
    
    <!-- Marker: Hiro (predefined) -->
    <a-marker preset="hiro">
      <!-- 3D Model -->
      <a-entity 
        gltf-model="url(assets/model.glb)"
        scale="0.5 0.5 0.5"
        position="0 0.5 0"
        rotation="0 45 0"
        animation="property: rotation; to: 0 405 0; loop: true; dur: 10000">
      </a-entity>
      
      <!-- Text overlay -->
      <a-text 
        value="Hello AR!" 
        position="0 1.5 0" 
        color="white" 
        align="center"
        scale="2 2 2">
      </a-text>
    </a-marker>
    
    <!-- Camera -->
    <a-entity camera></a-entity>
  </a-scene>

</body>
</html>
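Beyond the declarative markup, AR.js dispatches `markerFound` and `markerLost` events on each `<a-marker>` element, which is how scripts react to tracking state. A minimal sketch (the `createMarkerTracker` helper and its wiring are illustrative, not part of AR.js):

```javascript
// Track which markers are currently visible to the camera.
function createMarkerTracker() {
  const visible = new Set();
  return {
    onFound(id) { visible.add(id); },
    onLost(id) { visible.delete(id); },
    isVisible(id) { return visible.has(id); },
  };
}

// Browser wiring: AR.js fires these events on <a-marker> elements
// (skipped when there is no DOM, e.g. during testing).
if (typeof document !== 'undefined') {
  const tracker = createMarkerTracker();
  document.querySelectorAll('a-marker').forEach((marker, i) => {
    marker.addEventListener('markerFound', () => tracker.onFound(i));
    marker.addEventListener('markerLost', () => tracker.onLost(i));
  });
}
```

This is useful for pausing animations or hiding UI when the camera loses the marker.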

WebXR Implementation

// WebXR AR implementation
class WebXRManager {
  constructor() {
    this.xrSession = null;
    this.xrRefSpace = null;
    this.arHitTestSource = null;
  }
  
  async init() {
    // Check WebXR support
    if (!navigator.xr) {
      console.error('WebXR not supported');
      return;
    }
    
    const supported = await navigator.xr.isSessionSupported('immersive-ar');
    if (!supported) {
      console.error('AR not supported');
      return;
    }
  }
  
  async startAR() {
    this.xrSession = await navigator.xr.requestSession('immersive-ar', {
      requiredFeatures: ['local-floor'],
      optionalFeatures: ['hit-test', 'dom-overlay'],
      domOverlay: { root: document.getElementById('overlay') }
    });
    
    // Setup reference space
    this.xrRefSpace = await this.xrSession.requestReferenceSpace('local-floor');
    
    // Setup rendering
    const renderer = new THREE.WebGLRenderer({
      antialias: true,
      alpha: true
    });
    
    renderer.xr.enabled = true;
    
    // XRWebGLLayer wraps the renderer's WebGL context (made
    // XR-compatible first), not the renderer object itself
    const gl = renderer.getContext();
    await gl.makeXRCompatible();
    this.xrSession.updateRenderState({
      baseLayer: new XRWebGLLayer(this.xrSession, gl)
    });
    
    // Start render loop
    renderer.setAnimationLoop(this.onXRFrame.bind(this));
  }
  
  onXRFrame(time, frame) {
    const session = frame.session;
    const pose = frame.getViewerPose(this.xrRefSpace);
    
    if (pose) {
      for (const view of pose.views) {
        const viewport = session.renderState.baseLayer.getViewport(view);
        // Render scene for each view
      }
    }
    
    // Hit testing
    if (this.arHitTestSource) {
      const hitTestResults = frame.getHitTestResults(this.arHitTestSource);
      // Handle hit test results
    }
  }
  
  async setupHitTest() {
    const session = this.xrSession;
    const viewerSpace = await session.requestReferenceSpace('viewer');
    this.arHitTestSource = await session.requestHitTestSource({
      space: viewerSpace
    });
  }
}

// Place object on detected surface
class ARPlacementManager {
  constructor(scene, xrSession) {
    this.scene = scene;
    this.session = xrSession;
    this.reticle = null;
    this.model = null;
  }
  
  createReticle() {
    // Create reticle mesh for placement preview
    const geometry = new THREE.RingGeometry(0.15, 0.2, 32);
    const material = new THREE.MeshBasicMaterial({ 
      color: 0xffffff, 
      side: THREE.DoubleSide 
    });
    
    this.reticle = new THREE.Mesh(geometry, material);
    this.reticle.rotation.x = -Math.PI / 2;
    this.reticle.visible = false;
    
    this.scene.add(this.reticle);
  }
  
  updateReticle(hitTestPose) {
    if (hitTestPose) {
      this.reticle.visible = true;
      this.reticle.position.setFromMatrixPosition(
        hitTestPose.transform.matrix
      );
    } else {
      this.reticle.visible = false;
    }
  }
  
  placeObject(modelUrl) {
    if (!this.reticle.visible) return;
    
    // Load and place a 3D model at the reticle position
    // (GLTFLoader ships as a three.js addon, examples/jsm/loaders)
    const loader = new THREE.GLTFLoader();
    
    loader.load(modelUrl, (gltf) => {
      this.model = gltf.scene;
      this.model.position.copy(this.reticle.position);
      this.scene.add(this.model);
    });
  }
}
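The `updateReticle` method above copies the hit pose's translation out of its 4x4 transform matrix. WebXR exposes `XRRigidTransform.matrix` as a column-major `Float32Array`, so the translation sits in elements 12-14; a dependency-free sketch without three.js (the `positionFromMatrix` helper is illustrative):

```javascript
// Extract the translation from a column-major 4x4 transform matrix,
// as returned by XRPose.transform.matrix (a 16-element Float32Array).
// Column-major layout puts the translation in indices 12, 13, 14.
function positionFromMatrix(m) {
  return { x: m[12], y: m[13], z: m[14] };
}

// Example: identity rotation with a translation of (1, 2, 3).
const matrix = new Float32Array([
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  1, 2, 3, 1,
]);
const pos = positionFromMatrix(matrix);
```

three.js's `setFromMatrixPosition` does the same extraction internally.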

ARKit Implementation (iOS)

import ARKit
import RealityKit

class ARViewController: UIViewController, ARSessionDelegate {
  
  @IBOutlet weak var arView: ARView!
  
  override func viewDidLoad() {
    super.viewDidLoad()
    setupARSession()
  }
  
  func setupARSession() {
    let configuration = ARWorldTrackingConfiguration()
    
    // Plane detection
    configuration.planeDetection = [.horizontal, .vertical]
    
    // Image tracking
    configuration.detectionImages = ARReferenceImage.referenceImages(
      inGroupNamed: "AR Resources", 
      bundle: nil
    )
    
    // People occlusion
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
      configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }
    
    // Light estimation
    configuration.isLightEstimationEnabled = true
    
    // Run session
    arView.session.run(configuration)
  }
  
  // Place object on plane
  func placeObject(at point: CGPoint) {
    let results = arView.raycast(
      from: point,
      allowing: .existingPlaneGeometry,
      alignment: .horizontal
    )
    
    if let result = results.first {
      // Create anchor at hit location
      let anchor = ARAnchor(transform: result.worldTransform)
      arView.session.add(anchor: anchor)
      
      // Add the 3D model (try? avoids crashing if the asset is missing)
      guard let entity = try? Entity.loadModel(named: "model.usdz") else { return }
      entity.position = SIMD3<Float>(0, 0, 0)
      
      // Add to anchor
      let anchorEntity = AnchorEntity(anchor: anchor)
      anchorEntity.addChild(entity)
      arView.scene.addAnchor(anchorEntity)
    }
  }
}

// Image tracking
class ImageTrackingManager: NSObject, ARSessionDelegate {
  
  func setupImageTracking() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    
    // Load reference images
    guard let referenceImages = ARReferenceImage.referenceImages(
      inGroupNamed: "ProductImages",
      bundle: nil
    ) else { return configuration }
    
    configuration.detectionImages = referenceImages
    configuration.maximumNumberOfTrackedImages = 4
    
    return configuration
  }
  
  func session(_ session: ARSession, 
               didUpdate anchors: [ARAnchor]) {
    for anchor in anchors {
      if let imageAnchor = anchor as? ARImageAnchor {
        if imageAnchor.isTracked {
          // Image detected - show AR content
          showARContent(for: imageAnchor)
        } else {
          // Image lost
          hideARContent(for: imageAnchor)
        }
      }
    }
  }
}

ARCore Implementation (Android)

import com.google.ar.core.*
import com.google.ar.sceneform.*

class ARActivity : AppCompatActivity() {
  
  private lateinit var arFragment: ArFragment
  
  override fun onCreate(savedInstanceState: Bundle?) {
    super.onCreate(savedInstanceState)
    setContentView(R.layout.activity_ar)
    
    arFragment = supportFragmentManager
      .findFragmentById(R.id.ar_fragment) as ArFragment
    
    setupAR()
  }
  
  private fun setupAR() {
    // ArFragment exposes the ARCore Session through its ArSceneView
    val session = arFragment.arSceneView.session ?: return
    
    // Configure session
    val config = Config(session).apply {
      updateMode = Config.UpdateMode.LATEST_CAMERA_IMAGE
      planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
      lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
      
      // Depth support is queried on the Session, per depth mode
      if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        depthMode = Config.DepthMode.AUTOMATIC
      }
    }
    
    session.configure(config)
    
    // Handle taps for object placement
    arFragment.setOnTapArPlaneListener { hitResult, plane, _ ->
      placeObject(hitResult)
    }
  }
  
  private fun placeObject(hitResult: HitResult) {
    // Create anchor
    val anchor = hitResult.createAnchor()
    
    // Load a 3D model via Sceneform's ModelRenderable
    ModelRenderable.builder()
      .setSource(this, RenderableSource.builder()
        .setSource(this, Uri.parse("model.glb"), RenderableSource.SourceType.GLB)
        .build())
      .build()
      .thenAccept { renderable ->
        val anchorNode = AnchorNode(anchor)
        anchorNode.renderable = renderable
        
        // Add to scene
        arFragment.arSceneView.scene.addChild(anchorNode)
      }
  }
  
  // Object detection with a TensorFlow Lite image classifier
  fun enableObjectDetection() {
    val classifier = ImageClassifier.createFromFile(
      this, "custom_model.tflite"
    )
    
    // Process camera frames; acquireCameraImage() returns a single
    // Image that must be closed after use
    arFragment.arSceneView.scene.addOnUpdateListener { _ ->
      val frame = arFragment.arSceneView.arFrame ?: return@addOnUpdateListener
      
      try {
        frame.acquireCameraImage().use { image ->
          // Convert the YUV camera Image to a Bitmap before
          // classification (yuvToBitmap is an app-provided helper)
          val bitmap = yuvToBitmap(image)
          val outputs = classifier.process(TensorImage.fromBitmap(bitmap))
          
          // Handle detected objects
        }
      } catch (e: NotYetAvailableException) {
        // Camera image not available this frame
      }
    }
  }
}

Unity AR Development

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.ARFoundation;

public class ARPlacementManager : MonoBehaviour {
  
  [SerializeField] private ARRaycastManager raycastManager;
  [SerializeField] private GameObject prefabToPlace;
  [SerializeField] private GameObject placementIndicator;
  
  private GameObject spawnedObject;
  
  void Update() {
    // Update placement indicator
    UpdatePlacementIndicator();
    
    // Place object on tap
    if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began) {
      PlaceObject();
    }
  }
  
  private void UpdatePlacementIndicator() {
    var screenCenter = Camera.main.ViewportToScreenPoint(
      new Vector3(0.5f, 0.5f)
    );
    
    var hits = new List<ARRaycastHit>();
    raycastManager.Raycast(
      screenCenter, 
      hits, 
      TrackableType.PlaneWithinPolygon
    );
    
    if (hits.Count > 0) {
      placementIndicator.SetActive(true);
      placementIndicator.transform.position = hits[0].pose.position;
      placementIndicator.transform.rotation = hits[0].pose.rotation;
    } else {
      placementIndicator.SetActive(false);
    }
  }
  
  private void PlaceObject() {
    var screenCenter = Camera.main.ViewportToScreenPoint(
      new Vector3(0.5f, 0.5f)
    );
    
    var hits = new List<ARRaycastHit>();
    raycastManager.Raycast(
      screenCenter, 
      hits, 
      TrackableType.PlaneWithinPolygon
    );
    
    if (hits.Count > 0) {
      var pose = hits[0].pose;
      
      if (spawnedObject == null) {
        spawnedObject = Instantiate(prefabToPlace, pose.position, pose.rotation);
      } else {
        spawnedObject.transform.position = pose.position;
        spawnedObject.transform.rotation = pose.rotation;
      }
    }
  }
}

// AR Remote for debugging
public class ARRemoteManager : MonoBehaviour {
  
  [SerializeField] private ARSession arSession;
  
  void Start() {
    // Connect to AR Remote for live debugging
    // Useful for testing without deploying to device
  }
}

Enterprise AR Applications

# Enterprise AR deployment configuration
enterprise_ar:
  # Manufacturing
  manufacturing:
    use_cases:
      - name: "Assembly guidance"
        description: "Step-by-step holographic instructions"
        hardware: "HoloLens 2"
        savings: "30% faster assembly time"
        
      - name: "Quality inspection"
        description: "AR overlay for defect detection"
        hardware: "Mobile tablets"
        savings: "50% reduction in defects"
        
      - name: "Remote assistance"
        description: "Expert guidance via AR"
        hardware: "RealWear"
        savings: "40% reduction in travel"
  
  # Healthcare
  healthcare:
    use_cases:
      - name: "Surgical navigation"
        description: "AR overlay for minimally invasive surgery"
        hardware: "Microsoft HoloLens"
        
      - name: "Medical training"
        description: "3D anatomy visualization"
        hardware: "Tablets"
  
  # Retail
  retail:
    use_cases:
      - name: "Virtual try-on"
        description: "Clothing/furniture visualization"
        hardware: "Mobile"
        
      - name: "In-store navigation"
        description: "AR wayfinding"
        hardware: "Mobile"

  # Implementation
  implementation:
    mdm_solution: "Microsoft Intune"
    deployment: "MDM-managed"
    security:
      - "Data encryption at rest"
      - "Azure AD authentication"
      - "Conditional access policies"

AR Performance Optimization

// Performance optimization for AR applications

class ARPerformanceOptimizer {
  
  constructor() {
    this.targetFPS = 60;
    this.currentFPS = 60;
  }
  
  optimizeModel(model) {
    // Reduce polygon count (target: ~1,000 triangles for mobile AR)
    this.reducePolygons(model, 1000);
    
    // Optimize textures (cap at 1024x1024)
    this.compressTextures(model, 1024);
    
    // Enable LOD (Level of Detail)
    this.addLODLevels(model, [0.5, 0.25, 0.1]);
    
    // Remove unused materials
    this.cleanupMaterials(model);
  }
  
  reducePolygons(model, targetCount) {
    // Use simplification algorithm
    // Example: quadratic mesh simplification
  }
  
  compressTextures(model, maxSize) {
    const textures = model.material.textures;
    
    textures.forEach(texture => {
      if (texture.width > maxSize || texture.height > maxSize) {
        // Resize and compress
        texture.resize(maxSize, maxSize);
        texture.format = 'Basis Universal';
      }
    });
  }
  
  addLODLevels(model, distances) {
    // Create simplified versions for distance viewing
  }
  
  optimizeLighting() {
    // Use baked lighting where possible
    // Limit real-time lights to 2-3
    // Use light probes
  }
  
  enableOcclusionCulling() {
    // Only render visible objects
    // Use spatial partitioning
  }
  
  reduceDrawCalls() {
    // Combine static meshes
    // Use GPU instancing for repeated objects
  }
}
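`addLODLevels` is left as a stub above; the core of any LOD scheme is mapping camera distance to a detail level. A minimal sketch with hypothetical distance thresholds in meters (`pickLodLevel` is an illustrative helper, not a library API):

```javascript
// Map camera-to-object distance to an LOD index:
// 0 = full detail, increasing values = progressively simpler meshes.
// Thresholds are illustrative; tune them per model and device.
function pickLodLevel(distance, thresholds = [2, 5, 10]) {
  let level = 0;
  for (const t of thresholds) {
    if (distance > t) level++;
  }
  return level;
}
```

Each frame, the renderer swaps in the mesh for `pickLodLevel(distanceToCamera)`, so far-away objects cost far fewer triangles.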

// Frame budget management
class FrameBudgetManager {
  
  constructor() {
    this.budgetMs = 16.67; // 60 FPS
    this.currentMs = 0;
  }
  
  beginFrame() {
    this.frameStart = performance.now();
  }
  
  endFrame() {
    this.currentMs = performance.now() - this.frameStart;
    
    if (this.currentMs > this.budgetMs) {
      // Skip frames or reduce quality
      this.adjustQuality();
    }
  }
  
  adjustQuality() {
    // Reduce texture quality
    // Simplify models
    // Reduce effects
  }
}
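`adjustQuality` above is deliberately abstract. One common approach is a hysteresis controller: downgrade quality only after several consecutive over-budget frames (so a single spike does not thrash settings) and upgrade only after sustained headroom. A self-contained sketch with hypothetical level names and streak thresholds:

```javascript
// Adaptive quality controller with hysteresis. Level names and the
// 5-frame / 120-frame streak thresholds are illustrative choices.
class AdaptiveQuality {
  constructor(budgetMs = 16.67) { // 60 FPS frame budget
    this.budgetMs = budgetMs;
    this.levels = ['high', 'medium', 'low'];
    this.index = 0; // start at 'high'
    this.overBudgetStreak = 0;
    this.underBudgetStreak = 0;
  }

  get level() { return this.levels[this.index]; }

  // Call once per frame with the measured frame time in ms.
  recordFrame(frameMs) {
    if (frameMs > this.budgetMs) {
      this.overBudgetStreak++;
      this.underBudgetStreak = 0;
      // Downgrade only after 5 consecutive slow frames.
      if (this.overBudgetStreak >= 5 && this.index < this.levels.length - 1) {
        this.index++;
        this.overBudgetStreak = 0;
      }
    } else {
      this.underBudgetStreak++;
      this.overBudgetStreak = 0;
      // Upgrade only after ~2 seconds of fast frames (120 at 60 FPS).
      if (this.underBudgetStreak >= 120 && this.index > 0) {
        this.index--;
        this.underBudgetStreak = 0;
      }
    }
    return this.level;
  }
}
```

The asymmetric thresholds (fast to downgrade, slow to upgrade) keep the experience stable: users notice flickering quality changes more than a steady medium setting.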
