...earthquake-disaster management concepts. The play experience must remain simple—just a single scene where the ground shakes, buildings wobble, and the player can trigger or pause emergency responses through a short C# script. Platform The experience will be delivered as an AR game built with Unity. I’m targeting the standard mobile AR pipeline, so please structure the project so it can be switched to ARKit or ARCore later without major rework. Scene Content The environment only needs the essentials: • Buildings • Roads No complex destruction physics are required; a convincing camera shake, subtle object animation, and some particle dust will do the job. Keep meshes low-poly and textures lightweight so the scene stays performant on mid-range phone...
...visualise the collected data; the app will also provide over-the-air updates and diagnostic logs. • Clean, investor-ready demo: boot, pair, run a scripted scenario, and reset without developer tools. Preferred toolchain/environment Embedded Linux or RTOS on a low-power SoC (Snapdragon XR, similar), TensorFlow Lite or Edge Impulse for on-device models, BLE/Wi-Fi for connectivity, Unity/ARCore/ARKit or an equivalent lightweight AR SDK for the display layer, and React Native or Flutter for the application—open to alternatives if they reduce time-to-demo. Acceptance criteria 1. Prototype glasses operate untethered for at least 1 hour of mixed AR, voice and sensor use. 2. Voice intent accuracy ≥90 % in a quiet room. 3. Latency from sensor read to app dashboar...
...overlays, setting up interactive walkthrough paths and adding concise 3D animations that reveal how each site evolved over time. Here’s what I’ll look for in your proposal: • A brief outline of your modelling and texturing pipeline (Blender, Maya, Z-Brush, Substance, etc.) and how you handle mobile optimisation (poly budgets, LODs, PBR maps). • Your preferred tools for AR setup (Unity, Unreal, ARKit/ARCore) and any similar location-based projects you’ve shipped. • Indicative timeline and milestone breakdown from scan/data gathering through to final in-app testing. • Your current availability and bandwidth. Deliverables at each milestone will include the finished model (FBX or glTF), clean texture sets, animation files where needed...
...changes on clothing and the ability to add or remove accessories such as hats or glasses—directly inside Blender without touching the underlying mesh. Technical needs • Created natively in the latest stable Blender release. • Clean topology, subdivision-ready, UV-unwrapped and fully textured. • Full body + facial rig suitable for standard animation controls and realtime webcam tracking (ARKit, Live Link Face or similar). • All textures, materials, and separate accessory files included. Deliverables 1. .blend files for each character with complete rigs. 2. Exported FBX/GLTF versions with embedded textures. 3. A brief PDF or video walkthrough showing how to recolour garments and toggle accessories. When you reply, please outline: • ...
...installation instructions, object images, user journey. Some work has already been done but needs improvement to bring it up to a professional standard; you’ll optimise it further to build the AR-based journey. Skills required: • Prior experience implementing AR solutions is mandatory; a demo of prior work is expected during the interview. • Strong experience in native mobile AR development • ARCore (Android) & ARKit (iOS) • Camera frameworks (CameraX / AVFoundation) • OpenGL / Metal • Marker-based tracking (image / QR detection) • 3D model integration & optimisation • Animation handling in AR • Mobile performance optimisation Deliverables (to be discussed further): 1. Source project with clean, well-commented code. 2. One signe...
...Unity real‑time project, so it must be clean, lightweight, and animation‑ready. Deliverables: • Ready Player Me avatar generated from my images/video • Match the cartoon style, proportions, and personality in my references • Neutral teaching pose • Idle animations (subtle breathing, head turns, natural movement) • 2–3 simple gestures (hand lift, nod, emphasis gesture) • Exported for Unity with ARKit blendshapes enabled • Clean folder structure + instructions for importing into Unity Requirements: • Experience with Ready Player Me • Ability to match character proportions and style • Basic animation skills (idle + simple gestures) • Understanding of Unity‑ready avatar export • Ability to follow reference exactl...
I'm looking for a company with extensive experience in accelerating projects and integrating AI to speed up coding and project completion. They should also have knowledge of ARCore/ARKit. I want to build a dynamic AR project that moves between cities and around the world, using user-defined coordinates. This movement would be visible to everyone on their devices. My budget is $700. We can discuss everything further if this sounds like a fit for your company or team.
...standard iOS and Android devices without overheating the hardware. Required Technical Skills: Expert React Native Developer with experience in Native Modules. 3D Graphics Specialist: Deep knowledge of React Three Fiber or Unity C#. Audio Engineering: Experience with WebRTC audio tracks and FFT/RMS analysis for lip-syncing. Asset Management: Experience with Ready Player Me avatars and ARKit-compatible blend shapes. Deliverables: A clean, modular React Native component for the 3D Avatar. The bridge logic linking Grok's live audio to the avatar’s facial movements. A fully functional demo: The user speaks, Grok responds, and the 3D tutor talks back with perfect lip-sync....
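For context, the RMS half of that lip-sync analysis is simple signal maths: measure the loudness of each audio block, then map it onto a mouth-open blendshape weight. Below is a minimal sketch in Python purely for illustration (the posting's stack is React Native/native modules; the function names and the floor/ceiling thresholds are my own assumptions, not part of the brief):

```python
import math

def rms(samples):
    """Root-mean-square level of one block of PCM samples in [-1.0, 1.0]."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def mouth_open_from_rms(level, floor=0.01, ceiling=0.3):
    """Map an RMS level onto a 0..1 weight for an ARKit-style jawOpen blendshape.
    Levels below `floor` are treated as silence; `ceiling` saturates the mouth."""
    if level <= floor:
        return 0.0
    return min(1.0, (level - floor) / (ceiling - floor))
```

In practice this would run per audio frame on the WebRTC track, with some smoothing so the mouth does not flicker between frames.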
...product on both iOS and Android. Core build • Integrate the live avatar system so users can customise and animate their 3-D persona in real time. • Hook up the body-scan workflow (I will supply the libraries and reference assets) and make sure the scan data flows cleanly into our back-end. • Layer in the Augmented Reality mode that lets the avatar stand in a user’s real environment, using ARKit on iOS and ARCore on Android. • Create the supporting database structure for user profiles, scan files and avatar metadata, with secure cloud sync. What I bring to the table • Figma screens and interaction specs for every journey. • A clear user-flow document and acceptance criteria, so we can move fast without ambiguity. What I’m look...
I’m launching a location-based augmented-reality game and need a developer who can deliver functional builds for both iOS and Android within the next month. Players will move through the real world to discover digital items and challenges, so precise geolocation and smooth AR rendering are critical. My ideal workflow is Unity (or another cross-platform engine you’re comfortable with) paired with ARKit, ARCore, and a mapping solution such as Mapbox or Google Maps SDK. I’ll supply the game concept, core mechanics, and all 3D assets; you handle coding, integrating the location services, and packaging the apps for TestFlight and Google Play internal testing. Deliverables • Full Unity (or equivalent) project with clean, well-documented code • Compiled beta...
I want to give our customers the power to stand in their own bathroom, raise their phone and instantly see our shower doors, bathtubs, shower bases, glass panels and accessories appear at true-to-life scale. The app must run on both iOS and Android. Using ARKit and ARCore (or a comparable cross-platform engine such as Unity or Flutter’s AR plugins), the camera scan should detect walls, floor and ceiling, calculate real dimensions and create an editable room model automatically. Once the scan is finished, the user selects a product from our catalogue and places it in the space. For performance reasons basic 3D models are enough, yet the scale and orientation have to be exact. Inside the configurator the user can adjust three key options—Size, Finish and Glass type&m...
I’m looking for a mobile Flutter developer to help implement a specific part of a kids’ project. The project focuses on AR filters, where the child sees the content in front of them in the real ...During the experience: • Eye / face movement tracking • Capture images at specific moments • Send the images to an existing backend The backend is already implemented and handles image processing. No backend work is required. Scope of work: • Implement AR camera filters • Eye / face tracking • Image capture and API integration • Mobile side only Platforms: • Android (ARCore) • iOS (ARKit) Preferred experience: • AR Filters • Camera / Face or Eye Tracking • Android, iOS development (Flutter) The project is fu...
I need a skilled Flutter developer to plug a very specific feature set into my existing kids-focused mobile app. The task is all on the client side: build AR camera filters that work with ARCore on Android and ARKit on iOS, track individual eye movements in real time, and capture short videos at defined moments, then push those clips to a REST endpoint that is already live and documented. You will wire up the camera feed, layer the AR assets, implement per-eye tracking accuracy (not just general face gestures), trigger recording when my code tells you to, and send the resulting MP4 to the backend with the required headers and auth token. No UI beyond a minimal overlay is expected, so you can concentrate on the functionality itself. Deliverables • Flutter module(s) that expo...
...(to be proposed by freelancer) Budget: Open (hourly or milestone-based) We will evaluate proposals based on: technical understanding architecture decisions efficiency (not just price) Required Experience We prefer developers with experience in at least two of the following: BIM / Navisworks / IFC Unity or Unreal Engine WebRTC / Pixel Streaming GIS / GPS / Map-based systems ARCore / ARKit Server-side GPU rendering This project is suitable for experienced Indian developers or teams with strong technical background and cost-effective delivery. No mobile-only AR No “download BIM to phone” solutions No generic AR demos To Apply (Mandatory) Please include: Brief explanation of how you would implement: GPS map navigation transition to 3D / AR Propose...
I’m building a mobile app and need a set of real-time face filters that feel as smooth and engaging as TikTok’s. The filters must work natively on both iOS and Android and run at 60 fps without noticeable lag on recent devices. Scope of effects • Beauty enhancements – skin smoothing, tone balance, subtle eye and lip highlights. • Animal faces – ears, noses, whiskers and similar overlays that track head movement precisely. • Funny distortions – playful warps such as big-mouth, huge-eyes or wobble effects. What I expect from you – Source project set up in the AR framework you prefer (ARKit/ARCore, Unity with AR Foundation, or another proven engine). – Optimised te...
...together a small proof-of-concept augmented-reality filter that lives inside its own mobile app, not a social-media platform. The core of the experience is motion detection: the effect should react to full-body movement (for instance, when the user raises an arm or takes a step, a visual or particle effect appears). I am happy for you to choose the most efficient toolset—Unity with AR Foundation, ARKit/ARCore directly, or any comparable SDK—so long as it gives reliable body-tracking and runs smoothly on current iOS and Android devices. Deliverables • Source project with all assets and code • A compiled demo build (APK + iOS build folder) showing the body-movement trigger in action • Brief setup notes so I can open, edit, and re-export the pr...
I’m after a very basic proof-of-concept that demonstrates LiDAR-based 3D scanning on iPhone using Swift, ARKit/RealityKit, or any other native approach you find effective. When the app launches it should show a single button; tapping that button starts the LiDAR scan, and stopping it generates one STL file (the only required format). Right after processing, the mesh needs to: • display inside the app so I can visually confirm the scan, and • be saved locally for offline access. It would be great if the file could also be uploaded to Google Drive, provided that doesn’t add a lot of work for this prototype. Performance is important—I’ve tested similar code in “Antigravity” and know the device can handle fast reconstruction, so I expect comparable speed here. No fancy UI, ...
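For context, once triangles are extracted from the reconstructed mesh, the STL side of a job like this is mostly serialisation. Here is a minimal, illustrative sketch of an ASCII STL writer in Python (the real scan data would come from ARKit's mesh anchors in Swift, which is not shown; `write_ascii_stl` is a hypothetical helper):

```python
def write_ascii_stl(triangles, name="scan"):
    """Serialize triangles (each a tuple of three (x, y, z) vertices) to ASCII STL.
    Facet normals are written as zero vectors, which most STL viewers tolerate
    and recompute from the vertex winding."""
    lines = [f"solid {name}"]
    for v0, v1, v2 in triangles:
        lines.append("  facet normal 0 0 0")
        lines.append("    outer loop")
        for x, y, z in (v0, v1, v2):
            lines.append(f"      vertex {x} {y} {z}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)
```

A production version would emit binary STL instead, since dense LiDAR meshes make ASCII files very large.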
I’m building a marker-based augmented-reality application that will run smoothly on both iOS and Android. The goal is to overlay step-by-step assembly guidance, safety warnings, and real-time part information directly onto machinery so technicians can work faster...of models and UI at 60 fps on an iPhone 12 and a mid-range Android device. • Simple menu to choose a machine, download its asset bundle, and view a usage log. • Xcode and Android Studio projects handed over with commented source code, a brief README for build steps, and a short screen-capture of the app in action. If you’ve shipped similar industrial AR tools or have solid Vuforia/ARCore/ARKit experience, I’d like to see that. Let me know how soon you can demo a first prototype with one ...
...moments of the September 11, 1973 coup d’état. Historical accuracy is already documented; what I now need is an XR-savvy developer (or small team) who can turn that research into a living, geolocated narrative focused on immersive storytelling and interactive scenes, running smoothly on both iOS and Android. The prototype must: • lock 3-D content to precise GPS coordinates along the route, using ARKit/ARCore (cloud anchors or a similar solution) • blend spatial audio with authentic sound archives so visitors feel helicopters overhead, speeches echoing from balconies, and distant sirens in real time • present tappable or proximity-triggered interactions that reveal each key moment of the day, rather than a passive timeline • include a lig...
...lighting, and update instantly when the user changes color or size options. What I already have – 3D product models (FBX and GLTF) – Brand guidelines, color palettes, and reference imagery What I need from you – Scene setup in Unity, Unreal, or another engine you trust for MR that runs smoothly on iOS, Android, and, if feasible, HoloLens or Meta Quest – Accurate body, hand, or foot tracking (ARKit, ARCore, MRTK, or equivalent) – Lightweight shaders so the experience performs well on mid-range devices – A simple UI: product carousel, size selector, capture/share button, and “Add to cart” callback we will wire to our existing API – Clear build instructions and commented source so my in-house devs can keep iterati...
...fast head movement. - Important: Make sure to extract and use only good-quality frames. - Important: Save intermediate data on the phone and add an option to share it via AirDrop. This will help us debug and analyze, similar to our previous Faysics test app. - Use ARKit to track the head and provide on-screen cues that guide users to take an optimal scan. - Important: When aligning the 3D mesh with ARKit, remember ARKit coordinates can be inaccurate. - Previously, we compensated using capped depth values and the ARKit face mesh. It worked sometimes, but not reliably—please try to find a more robust solution. - Before delivering the full project, provide a demo build like you did before. After reviewing the demo and the code, we will guide you on...
...annotated video clip) Primary objective: immediate coaching decisions, not data overload. ⸻ 3) Target Platform & Tech Preferences Platform: iPadOS (iPad-first experience) Preferred stack (open to discussion): • Swift / SwiftUI • AVFoundation (video capture) • Apple Vision pose estimation (2D; 3D where supported) • Core Motion (IMU signals for smoothness/intensity proxies) • Optional later: ARKit/RealityKit for LiDAR devices Constraints: • Offline-first (works on pitch, sync later optional) • Fast: near real-time scoring/feedback • Privacy: data stored locally by default; cloud sync can be phased in ⸻ 4) Core User Workflows (Must-have) A) Create session • Select athlete (or quick-add) • Choose session templat...
...What needs to happen • Work only with the textures and materials bundled inside the file—tweak colours, roughness, metallic values, or swap the existing maps as my guide specifies. • Nudge the cube’s proportions so it reads correctly in portrait AR view without breaking the UVs. • Add a lightweight, looping animation (rotation plus a subtle particle-style glow) that keeps frame-rates high on ARKit and ARCore devices. Performance targets and file output Keep polygon count and texture resolutions inside the limits I’ll share (they’re well within typical WebAR caps). When you’re done, I need: 1. The updated .blend and an optimized .glb ready for import. 2. A short screen capture or TestFlight / APK build that proves the cube ...
I need a set of ten original, AR-ready characters, all sharing a fun, cartoonish 3D style. The look should sit comfortably between low-poly and ultra-realistic: clean geometry and expressive features, but without the heavy, film-level complexity. These characters will be dropped straight into an augmented-reality environment, so working knowledge of standard pipelines—Unity, ARKit, ARCore, or similar—is helpful. Please keep poly counts sensible for mobile, apply efficient PBR textures, and rig each model for basic humanoid animation (idle, walk, simple gesture). Deliverables • 10 unique, moderately detailed 3D characters in FBX or glTF • Rigged and skinned, ready for generic humanoid animations • 1 × 512–2 K texture set per character ...
I need the attached GiftCube graphic turned into a crisp 2 D asset that will slot straight into both ARKit (iPhone) and ARCore (Android) scenes. The exact pixel dimensions are spelled out in the spec sheet I’ve included, so please mirror those numbers precisely. A transparent-background PNG is the must-have deliverable; if creating JPEG or SVG versions at the same size is straightforward for you, feel free to include them as well—I’ll gladly take the extra formats for flexibility. Colour fidelity and sharp edges are critical, because the sprite will be viewed up close inside an AR overlay. I’ll drop the finished file into Unity and RealityKit to confirm: • the dimensions match the attachment, • no artefacts appear at 100 % zoom, • load time r...
I need an experienced AR developer to create a marker-based augmented reality application for both iOS and...will be triggered by markers on physical objects in the environment. Key Requirements: - Develop an engaging and interactive AR experience - Create and integrate 3D models and animations - Ensure high performance and responsiveness on both platforms - Conduct thorough testing and debugging - Provide detailed documentation Ideal Skills and Experience: - Proficiency in AR development tools (e.g., ARKit, ARCore) - Strong background in 3D modeling and animation - Experience with mobile app development for iOS and Android - Knowledge of Unity or Unreal Engine is a plus - Excellent problem-solving skills and attention to detail Please include examples of previous AR projects in ...
...assistants, advanced data extraction, and sophisticated recommendation/matching engines. This includes the ability to integrate AI models with complex internal databases for real-time, accurate data-driven analysis and predictions (e.g., for quoting/estimating tools). Augmented Reality (AR): Proven experience in developing or integrating AR systems, including familiarity with AR SDKs (ARCore, ARKit, Unity) and concepts of spatial computing and 3D model integration for interactive, real-world applications. API Integrations: Extensive experience with secure third-party API integrations (payment gateways, accounting software, communication services, mapping APIs). High-Scale Architecture: Demonstrated ability to design, implement, and deploy cloud-native solutions capable of hand...
...recording unless user is physically within a defined radius • Apply location-locked filters • Embed location metadata into captured media • Camera should integrate cleanly with the rest of the app backend ⸻ 4. Platform & Tech Expectations • Mobile platforms: iOS and Android • React Native, Swift, Kotlin, or hybrid approaches are acceptable • Experience with AR frameworks required (ARKit, ARCore, OpenGL, Metal, or equivalent) • Must understand performance optimization for real-time camera rendering • Code should be clean, modular, and well-documented ⸻ Nice-to-Have (Not Required) • Experience with Snapchat Lens Studio, Spark AR, or custom AR engines • Prior work on social media, camera apps, or r...
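The within-radius gate described above usually reduces to a great-circle distance check against the device's GPS fix. A minimal illustration in Python (the real app would use the platform location APIs; `within_radius` is a hypothetical helper, not part of the brief):

```python
import math

def within_radius(lat1, lon1, lat2, lon2, radius_m):
    """True if two WGS-84 points are within radius_m metres of each other,
    using the haversine formula on a spherical Earth model."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a)) <= radius_m
```

In a shipping app the radius would also need a tolerance for GPS accuracy, since raw fixes can drift by tens of metres.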
...a smooth, continuous, almost hypnotic movement that reinforces the sense of presence without distracting with abrupt changes. Imagine a flowing ribbon or an organic shape that breathes; the key is that it never jumps or stops. Minimum deliverables • Optimised model (glTF/GLB or USDZ), textures included • Seamlessly looping animation • Spatial anchoring: coordinates and tracking system (ARKit, ARCore or WebAR) so the public can discover it with a standard phone • Installation of the piece at the agreed location • Leaving the project 100% ready for users to start interacting with it I welcome creative proposals on materials, scale and colour as long as they respect the ...
I need an AR camera feature to my app that detects real-world surfaces in real time and lets the user place, move and scale any 3D model so it stays locked to that spot as the camera moves around it. The code has to be clean, well-commented and built with production-ready tools such as ARKit, ARCore, Unity or Unreal (I’ll take your recommendation once you’ve seen the larger project scope). Please expose a simple API so my app can feed in a model file at runtime and receive back the anchor’s world coordinates for later sessions. Deliverables: • Mobile demo project with an in-camera view, surface detection and persistent anchoring • Ability to load at least one common 3D format (glTF, FBX or OBJ) • Short integration guide and build instructions ...
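The "anchor's world coordinates for later sessions" part of that API can be as simple as serialising each placed model's world position keyed by an identifier. An illustrative sketch in Python (names are hypothetical; a production app would persist ARKit/ARCore anchor data or cloud anchors rather than raw coordinates, since raw world space is not stable across sessions):

```python
import json

def dump_anchors(anchors):
    """Serialize anchor world positions {model_id: (x, y, z)} to a JSON string
    the host app can store and hand back on the next session."""
    return json.dumps({model_id: list(pos) for model_id, pos in anchors.items()})

def load_anchors(blob):
    """Inverse of dump_anchors: restore {model_id: (x, y, z)} tuples."""
    return {model_id: tuple(pos) for model_id, pos in json.loads(blob).items()}
```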
I need a straightforward iPad-only utility built with Swift and ARKit that opens, starts the LiDAR camera, and helps the user capture a door’s width and height in inches. The scan should appear as an AR overlay so the two measurement lines and their numbers sit directly on top of the live image. If the automatic endpoints are a little off, the user must be able to drag each point into place with a simple touch-and-move gesture—no sliders or numeric input screens. Once the user is satisfied, a single tap on “Save” creates a PDF that includes: • the photo of the door (or window, because I’d like that option supported too) • the final width × height in inches • a timestamp • a small company logo (I’ll drop the asset...
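For context, the width and height numbers in a tool like this reduce to distances between the dragged endpoint pairs in world space, converted from metres (ARKit's native unit) to inches. A minimal illustration in Python (the real implementation would be Swift; `distance_inches` is a hypothetical helper):

```python
import math

METERS_PER_INCH = 0.0254

def distance_inches(p0, p1):
    """Distance in inches between two 3D points given in metres,
    e.g. the ARKit world-space endpoints of one measurement line."""
    return math.dist(p0, p1) / METERS_PER_INCH
```

A 36-inch door opening, for example, spans 0.9144 m between its endpoints.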
Hello, I would like to know if it is possible for you to provide this service. We have an internal web application with an animated 3D avatar (GLB format) created using Ready Player Me. This avatar was exported with visemes (ARKit and Oculus), so with our current implementation we are able to generate TTS and synchronize it with lip movements. Additionally, we use some Mixamo animations (such as listening, talking, idle, etc.). We would like this avatar (while maintaining all its current functionality) to look more realistic. Ideally, it would be a MetaHuman, as it currently looks somewhat cartoonish or artificial. Any approach that represents a real visual improvement would be useful for us. We are currently using an AI-generated image for initial testing; conceptually, the avata...
...display context-aware information through an augmented-reality overlay. Core workflow • The GPS module keeps an accurate, low-latency lock on the device position and movement history. • That live data is fed into an AI layer—preferably a lightweight on-device model or a cloud endpoint—that looks for patterns, forecasts next likely locations, and suggests actions or insights. • An AR view (ARKit, ARCore or any comparable framework you prefer) then anchors those insights to the real world so the user sees relevant, real-time information hovering over their surroundings. What I will call “done” • Smooth background location tracking with minimal battery drain. • Predictive analytics results delivered in under two second...
...mid-range mobile devices). • Materials: PBR workflow. Textures max 2K resolution. • Compression: Must be exported using Draco compression. • Rigging: Standard humanoid skeleton with full finger rigging for gestures. 4. Facial Animation & Lip-Sync (Critical) The model must be production-ready for real-time speech driven by external data (Azure TTS): • Blendshapes: Full set of 52 ARKit-standard Morph Targets (Shape Keys). • Accuracy: Visemes must be anatomically correct to ensure natural mouth movements during speech. • Hierarchy: Clean naming convention for all morph targets (no engine-specific prefixes like "CC_" or "Unreal_"). 5. Deliverables 1. Source Files: Original project files (CC4, Blender, o...
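A delivery against the blendshape requirements above is easy to sanity-check mechanically: count the morph targets and reject engine-specific prefixes. An illustrative Python sketch, shown only to make the acceptance criteria concrete (`check_morph_targets` is a hypothetical helper, not part of the brief):

```python
ARKIT_BLENDSHAPE_COUNT = 52  # the ARKit-standard morph-target set size
BAD_PREFIXES = ("CC_", "Unreal_")  # engine-specific prefixes the brief forbids

def check_morph_targets(names):
    """Return a list of problems with a delivered morph-target name list:
    wrong count, or names carrying engine-specific prefixes."""
    problems = []
    if len(names) != ARKIT_BLENDSHAPE_COUNT:
        problems.append(
            f"expected {ARKIT_BLENDSHAPE_COUNT} targets, got {len(names)}"
        )
    for name in names:
        if name.startswith(BAD_PREFIXES):
            problems.append(f"engine-specific prefix on {name!r}")
    return problems
```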
I already have an iOS app in the store that relies on ARKit and the LiDAR sensor, but several of its core features need a technical tune-up. Right now the 3D scanning, object detection, and environmental mapping work, yet they struggle with accuracy and speed once models become dense. I’m looking for a Swift developer who lives and breathes Apple’s ARKit mesh APIs and has hands-on experience with LiDAR—bonus points if you have played with Space Capture, Polycam, or similar apps and understand the tricks they use to keep meshes clean and frame rates high. Here’s what I need from you: • Jump straight into the codebase and focus purely on development—UI and QA are already covered. • Refactor my existing pipeline so the raw LiDAR data ...
...precise Metahuman creation, cinematic-grade setup, and hands-on Unreal Engine development. The final asset will be integrated into an AWS-hosted embodiment featured in cinematic live streaming, so photographic realism, correct facial topology, and believable motion are critical. Here is what I have in mind: [AVATAR CORE] 1. or (full rig) 2. All PBR textures 3. ARKit/blendshape facial rig 4. Hair/groom assets [ANIMATIONS] 5. Idle, talking, breathing animations 6. Expression animations (happy/sad/curious) 7. Additional gesture loops [UNREAL ENGINE PROJECT] 8. Full UE5.7 project folder 9. Imported avatar assets (skeletal mesh, materials) 10. Animation blueprint 11. Character blueprint 12. Facial controller blueprint [PIXEL STREAMING] 13. Full packaged Pixel
...Fixed-price, end-to-end development Delivery Model: Contractor/Agency delivers 100% of the finished system We are NOT hiring individual developers. We need a complete delivery team. Project Overview We need a turnkey Virtual Museum Platform built and delivered as a complete product. The system will include: 1. Web-based 3D Virtual Museum (WebGL) 2. Mobile AR Experience (Unity + ARCore/ARKit) 3. Gamification System (points, badges, quests) 4. Backend + Admin CMS 5. 3D Models + Content Integration 6. Analytics Dashboard 7. Launch + Documentation + Warranty Support We will provide: * A concept document * Example PDF outlining museum experience models * Content and narrative * Requirements list You will deliver everything else. Required Expertise We expect your team to include: * ...
...the minimum prototype must include: • Selection of different landscaping materials (stone, grass, gravel, wood, etc.). • Real-time visualisation so the user sees changes instantly. I have already decided to work with Unity (or whatever is most recommended on the market) as the main engine; however, I expect your suggestions on complementary libraries and frameworks (AR Foundation with ARCore/ARKit, Vuforia, WebXR if a web layer suits us, among others). I am providing a very preliminary flow as a reference, but I trust you can propose: 1. A clear, conversion-oriented layout for the website. 2. The functional architecture of the VR module, with a brief justification of the chosen technology and...
...expressions, and proportions. Fully rigged face + body. Should feel like a true digital clone, not a stylized character. 2. Automated Animation Pipeline I need a system that lets me generate new videos anytime without manual keyframing. For example: Input: audio or text script Output: fully animated talking video with realistic mouth movement + natural gestures This can be done via: Control Rig ARKit blendshapes Speech-to-animation plugins Python/Blueprint automation I don’t care what tools you use — just that the final workflow is simple and repeatable. 3. Accurate Lip-Sync + Natural Gestures Lip movement must match my speech realistically Should not look “AI stiff” Ability to include head turns, micro-expressions, and optional hand moveme...
...need a full-stack augmented reality solution that lets customers virtually try on our jewellery catalogue both on their phones and on an in-store large-screen display. The experience has to feel premium: crisp, photorealistic pieces that hold up whether the user is standing a metre away from a 55-inch kiosk or sitting at home on a mobile screen. Scope of work • Build native AR try-on for iOS (ARKit) and Android (ARCore) as well as a kiosk version that can run the same scene in Unity or Unreal on Windows. • Create the 3D models from my design files; every ring, pendant and earring must be modelled, textured and optimised for real-time rendering. • Allow users to rotate and zoom each piece, see it automatically aligned on their hand, neck or ear with live came...
I need an end-to-end iPad and iPhone application that lets my sales reps walk up to a doorway, scan it with LiDAR, and instantly drop a true-scale 3D door into the opening. Using ARKit (RealityKit/SceneKit), the app must: • Detect and measure the opening automatically, storing width, height, and jamb depth. • Overlay selectable 3D door models—multiple styles, materials, and fully customizable colors—so the customer can preview exactly how each option looks in their own space. • Capture screenshots or short clips for later reference. Once a model is chosen the rep should be able to tap “Generate Proposal” and have a branded PDF appear seconds later. That PDF needs to pull in: • Customer details entered on a simple form. • All s...
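The measurement step above reduces to simple geometry once the scan yields the opening's corner points. A hedged sketch (the corner detection itself would come from ARKit/LiDAR; the helper names and sample coordinates below are invented):

```python
import math

# Hypothetical post-processing step: given three corners of a detected
# opening (in metres, world space) plus a matching point on the inside
# face of the jamb, derive the width, height, and jamb depth to store
# with each scan. The numbers are made up for illustration.

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def opening_dimensions(bl, br, tl, inner):
    """bl/br/tl = bottom-left, bottom-right, top-left corners;
    inner = the same bottom-left corner on the inner jamb face."""
    return {
        "width_m": round(dist(bl, br), 3),
        "height_m": round(dist(bl, tl), 3),
        "jamb_depth_m": round(dist(bl, inner), 3),
    }

dims = opening_dimensions(
    bl=(0.0, 0.0, 0.0),
    br=(0.914, 0.0, 0.0),    # ~36 in opening width
    tl=(0.0, 2.032, 0.0),    # ~80 in opening height
    inner=(0.0, 0.0, 0.114), # ~4.5 in jamb depth
)
print(dims)
```

These three values are exactly what the "Generate Proposal" PDF would later pull in alongside the customer form data.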
...editing or deleting is not required right now. Once a video is uploaded, the JSON that describes the bubble layout will be pushed to the user app via our current API. Release & delivery • Update the Flutter codebase to include the new AR layers (ARCore / ARKit compliant). • Build and pass both stores’ review processes with the new features enabled. • Supply updated source code and store-ready binaries, plus a short README so I can reproduce the builds locally. If you have recent experience shipping Flutter apps that marry ARCore/ARKit functionality with video playback, this task should feel familiar. Let me know a realistic timeline for completing the coding, preparing the store assets and guiding the releases. Video - augmented reality fl...
I have 2 MMD models converted to Genesis 8 and imported into Blender with their DAZ rigs. I need someone with experience setting up ARKit blendshapes for facial mocap to build the blendshapes onto both models.
...overlay (glasses, makeup, jewellery—think virtual fitting-room). • The heavy lifting happens in an API you will build; the mobile front ends on both iOS and Android simply call it and render the result. Accuracy matters, but the breakthrough experience hinges on smooth, believable 3D try-on, so every design choice should prioritise that. If you like working with OpenCV, MediaPipe, TensorFlow, ARKit/ARCore or your own blend of frameworks, I’m open—as long as the final deliverables include: 1. A documented REST/GraphQL API that receives a camera frame (or short clip) and returns: – Face bounding box and landmarks – Calculated face-shape classification – Suggested dominant skin-tone values – The 3D mesh / trans...
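One plausible shape for the face-shape classification the API must return is a rule over landmark-distance ratios. A toy sketch only; the thresholds, categories, and inputs below are invented, and a production classifier would likely be learned from data:

```python
# Toy illustration of landmark-ratio face-shape classification for the
# try-on API. Inputs are distances measured from the detected landmarks
# (any consistent unit, e.g. pixels). Thresholds are assumptions.

def classify_face_shape(face_width, face_height, jaw_width):
    """Return a coarse face-shape label from three landmark distances."""
    ratio = face_height / face_width       # elongation
    jaw_ratio = jaw_width / face_width     # jaw prominence
    if ratio >= 1.4:
        return "oblong"
    if ratio >= 1.2:
        return "oval" if jaw_ratio < 0.85 else "square"
    return "round"

print(classify_face_shape(face_width=100, face_height=128, jaw_width=80))
# → oval
```

In the full pipeline this would sit behind the REST/GraphQL endpoint, downstream of the landmark detector (MediaPipe, ARKit, or similar), alongside the skin-tone and 3D-mesh outputs.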
...- Add photo gallery with GPS map - Build stage timeline visualization - Create analytics dashboard with charts - Enhance 3D model (click objects, show frozen elements, deviation markers) - Add user management UI - Write E2E tests (Playwright/Cypress) **Phase 3: Mobile AR Integration (CRITICAL)** - Real-time AR overlay showing expected object positions - GPS + Compass + Accelerometer fusion - ARKit (iOS) and ARCore (Android) integration - Green circles showing “Socket should be HERE” - Real-time positioning guidance - Enhance quality checks (blur, lighting, tape measure detection) - Improve camera integration and offline sync - Push notifications and haptic feedback - Test on multiple iOS/Android devices **Phase 4: ML/CV Service** - YOLOv8 object detection (sockets...
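The GPS + compass half of the Phase 3 overlay comes down to computing distance and relative bearing from the device to each expected object position, which is what tells the renderer where to place a green circle. A minimal sketch under stated assumptions (equirectangular approximation, invented coordinates, no accelerometer tilt fusion):

```python
import math

# Minimal GPS + compass sketch for the AR overlay: distance and
# relative bearing from the device to a target socket position.
# Equirectangular approximation is fine over the few metres of a site;
# a real implementation would also fuse accelerometer/gyro tilt.

EARTH_R = 6371000.0  # mean Earth radius, metres

def target_offset(lat1, lon1, lat2, lon2, heading_deg):
    """Return (distance_m, relative_bearing_deg) from device (lat1, lon1)
    to target (lat2, lon2); relative bearing is clockwise from the
    compass heading the device currently faces."""
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1) * math.cos(math.radians(lat1))
    x = dlon * EARTH_R  # metres east
    y = dlat * EARTH_R  # metres north
    distance = math.hypot(x, y)
    bearing = math.degrees(math.atan2(x, y)) % 360.0
    relative = (bearing - heading_deg) % 360.0
    return round(distance, 1), round(relative, 1)

# Device facing north; target socket roughly 10 m due east.
d, b = target_offset(52.0, 13.0, 52.0, 13.000146, 0.0)
print(d, b)
```

A relative bearing near 0° means "straight ahead"; the renderer maps (distance, relative bearing) into the camera frame, where ARKit/ARCore anchors then keep the circle stable between GPS fixes.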