Varun Innovates
Experiments, prototypes, build notes.
Public lab by Varun Siddaraju
A public lab for applied XR + AI research: experiments, research notes, build logs, and work-in-progress prototypes.
Context
Varun Siddaraju is the person and research record. Varun Innovates is the public lab. VeeRuby is the company for client delivery.
Long-form credibility and public body of work.
Experiments, prototypes, build notes.
Enterprise XR and spatial AI systems.
Current Artifacts
This section should only show work that helps a visitor understand the current direction: frameworks, interfaces, APIs, and notes that may become papers, products, or reusable tools.
A real-time context inference system using gaze, head pose, posture, hands, and spatial context to classify user state as engaged, distracted, transitioning, or idle.
An early-stage OpenXR-native platform direction exploring APIs, adaptive behaviors, and world-model experiments for XR workflows.
A rapid-prototyping bucket for WebXR, headset experiments, AI + vision integration, and real-time perception tests that are too early for polished writeups.
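The context inference system above classifies user state from multimodal signals. A minimal sketch of what that classification surface could look like — all names, thresholds, and signal choices here are illustrative assumptions, not the actual Harmony API:

```typescript
// Hypothetical sketch of a context-state classifier. The type names,
// signal fields, and threshold values are assumptions for illustration.

type UserState = "engaged" | "distracted" | "transitioning" | "idle";

interface ContextSignals {
  gazeOnTaskRatio: number; // 0..1 share of recent gaze samples on the active task surface
  headMotion: number;      // normalized head-pose velocity, 0..1
  handsActive: boolean;    // any tracked hand currently interacting with UI
  postureStable: boolean;  // torso posture within a stable band
}

function classifyUserState(s: ContextSignals): UserState {
  // Idle: no hand activity and almost no gaze on the task.
  if (!s.handsActive && s.gazeOnTaskRatio < 0.1) return "idle";
  // Transitioning: large head motion or unstable posture suggests a context switch.
  if (s.headMotion > 0.6 || !s.postureStable) return "transitioning";
  // Engaged vs. distracted: split on how much gaze stays on the task.
  return s.gazeOnTaskRatio >= 0.5 ? "engaged" : "distracted";
}
```

In a real system these signals would be smoothed over time windows rather than read instantaneously; the sketch only shows the shape of the decision.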
Now + Recent
Use the tabs to scan active work, review items, completed cleanup, and planned public notes without taking in the whole lab at once.
The threads currently shaping the lab story and research direction.
Modeling gaze, head pose, hands, posture, boundaries, and task context so XR systems can infer human state instead of reacting blindly.
The build lane is active. Current work is turning the patterns, tradeoffs, and examples into clear public notes.
Refactoring an existing product while moving it to a newer version, with the goal of cleaner architecture, compatibility, and release-ready behavior.
Work that exists, but is being tightened before it becomes stronger public evidence.
The systems manuscript is in review: tightening structure, examples, and usefulness before treating it as a finished public credential.

The draft is in review; current work focuses on figures, language, and argument quality.
Redesigned the Varun Siddaraju, Varun Innovates, and VeeRuby websites; now refining polish, routing, and proof links before calling the network finished.
Recently wrapped or clarified items that explain why the public lab looks the way it does now.
The current exploration pass is complete and captured as a foundation for continuity, adaptation, and explainability across XR sessions.
The mentorship is wrapped. What remains is sharper thinking around world-model demos, prototype quality, and how teams move from polish to system quality.
The repo map now reflects the actual stack: Harmony design work, command centers, books, docs, experiments, and applied XR builds.
Promising threads that should become cleaner demos, notes, or conversations next.
Talking with educational institutions about integrating AI tutors, adaptive feedback, and richer learning loops into VR education experiences.
Lab Map
Move through the map in order, or jump by intent. Each lane links back to this map and forward to the next lane.
Research notes hold the argument: how XR systems perceive context, mediate interface complexity, support collaboration, and explain adaptation without hiding control from users.
Uses the Gamma framing: perception of context, adaptive mediation, and collaboration orchestration as the three linked layers of the research spine.
Defines when gaze, head pose, hands, posture, boundaries, and task state are useful enough to drive an adaptive interface.
Keeps adaptation opt-in, reversible, user-tunable, and measurable through task outcomes, cognitive load, trust, and well-being.
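The guardrails above — opt-in, reversible, user-tunable — can be made concrete as a policy object that every adaptive behavior must pass through. This is a sketch under assumed names, not a real Harmony type:

```typescript
// Illustrative guardrail policy: adaptation stays opt-in, rate-limited
// (user-tunable), and reversible. All names and defaults are assumptions.

interface AdaptationPolicy {
  enabled: boolean;                // opt-in: off by default
  maxAutoChangesPerMinute: number; // user-tunable cap on automatic interface changes
  undoWindowMs: number;            // every adaptation stays reversible within this window
}

const defaultPolicy: AdaptationPolicy = {
  enabled: false,                  // nothing adapts until the user turns it on
  maxAutoChangesPerMinute: 4,
  undoWindowMs: 30_000,
};

// Gate that any adaptive behavior must pass before acting.
function mayAdapt(policy: AdaptationPolicy, changesThisMinute: number): boolean {
  return policy.enabled && changesThisMinute < policy.maxAutoChangesPerMinute;
}
```

The point of the gate is that "measurable" follows naturally: every call to it is a countable event, which is what ties adaptation back to task outcomes and trust.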
This lane is for small, inspectable demos: adaptive layouts, continuity benches, world-model memory, and AI workflows that reveal what breaks.
Turns multimodal XR signals into interface decisions that can be logged and explained after the session.
Tests how much spatial memory must persist before an XR session feels continuous instead of reset-heavy.
Captures cases where the model reads attention, posture, or intent poorly so the next version can be more trustworthy.
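The "logged and explained after the session" idea above implies a decision record that pairs each interface change with the signal snapshot and rule that produced it. A hypothetical shape, with all field and class names assumed for illustration:

```typescript
// Hypothetical loggable adaptation decision and a post-session explainer.
// Field and class names are assumptions, not an actual Harmony API.

interface AdaptationDecision {
  timestampMs: number;
  inputs: Record<string, number | string | boolean>; // signal snapshot that drove the decision
  action: string;      // e.g. "collapse-secondary-panels"
  reason: string;      // human-readable rule that fired
  reversible: boolean; // whether the user can undo it from session review
}

class DecisionLog {
  private entries: AdaptationDecision[] = [];

  record(d: AdaptationDecision): void {
    this.entries.push(d);
  }

  // Post-session explanation: replay every decision as a readable line.
  explain(): string[] {
    return this.entries.map(
      (d) => `[${d.timestampMs}ms] ${d.action} because ${d.reason}`
    );
  }
}
```

Keeping the raw signal snapshot alongside the fired rule is what makes the failure-case lane workable: a misread of attention or posture can be traced to the exact inputs that caused it.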
Tools and SDKs are the reusable surfaces: APIs, plugins, loaders, evaluation helpers, and implementation notes worth using again.
Maps the developer layer for context descriptors, spatial memory, adaptive behaviors, and extension points.
Collects reusable patterns for gaze, hands, posture, boundaries, and task-state instrumentation.
The build log explains what changed, what was removed, what remains blocked, and what needs proof before promotion.
Tracks validation, sitemap, build artifact, and deploy-readiness work across the site network.
Logs are temporary. The better version is a page with screenshots, tradeoffs, and a clear outcome.
Product models translate lab patterns into education pilots, command centers, automation systems, and VeeRuby delivery shapes.
Explores AI tutors, adaptive feedback, and learning loops for VR education experiences.
Separates public lab thinking from scoped VeeRuby delivery so both surfaces stay clear.
Applied lessons keep production and client patterns abstract enough to be public while still useful for future architecture decisions.
If systems forget context between sessions, users repeat orientation work. That is a system design problem, not a UI polish issue.
XR + AI systems need latency thinking across sensing, inference, orchestration, and rendering before demo polish hides the problem.
Selected Signals
Only the signals that strengthen the current research and prototype story belong here.
Sensors, IEEE INFOCOM Workshops, and ICDT give the lab real technical lineage instead of pure founder storytelling.
This is the only award shown here because it has a clean external receipt and ties directly to the research lineage.
The published mixed reality book is the stronger public signal. Small Teams, Strong Systems is now framed as an in-review systems manuscript.
Texas State research, Ong product work, and VeeRuby delivery are the trail that explains why Harmony exists at all.
Research + Draft Papers
Harmony focuses on persistent user models, contextual spatial memory, and explainable AI mediation. The lab should show drafts, figures, experiments, and implementation evidence when they are useful.
GitHub Signal
There is real repo activity across public work, private systems, internal docs, and in-progress builds. The site should show the map without overstating polish.
The core research and platform layer spans HarmonyXR_GazePoseContext, system design work, and experimental XR + AI prototypes for multimodal context inference and practical adaptive behavior testing.
These repos support AI command centers, internal operating systems, and the sites themselves rather than standalone public libraries.
Some repos are not product code. They hold books, documentation, personal operating structures, and knowledge architecture that feed the broader lab.
This bucket covers company work, immersive products, vertical demos, and partner-facing builds. Useful signal, but not the same thing as reusable open-source infrastructure.
Demo Library
The selected tab keeps the page compact. Category filters keep the wider public demo trail available.
AI-assisted VR education environment and one of the clearest bridges between the lab's XR history and its current AI direction.
Interactive chemistry learning in AR.
Medical visualization in immersive space.
Mixed reality anatomy exploration.
Testing surgical workflow concepts in XR.
Urban-scale spatial review in XR.
Mixed reality support for site inspection and planning.
Property visualization with Unity, ARKit, and Azure Spatial Anchors.
Hands-on welding training for Meta Quest 2.
Immersive training for mining safety procedures.
Virtual reality driving training on Quest.
Collaborative engineering design review in AR.
Virtual collaboration room built for Quest.
Remote holographic assistance on HoloLens 2.
Spatial computing for renewable energy infrastructure planning.
3D globe and geospatial data visualization in mixed reality.
Commercial Translation
The lab keeps reusable thinking public. VeeRuby carries scoped projects, enterprise delivery, and client-facing XR + AI outcomes.
Boundary
Some ideas from this lab can become client-ready systems through VeeRuby: XR training, mixed reality apps, spatial AI prototypes, Unity/Unreal/WebXR builds, and productized pilots.
Books + Manuscripts
The published Apress book is a receipt. The systems manuscript is included as an in-review thread, not as a finished credential.
Published technical book on hands-on mixed reality implementation.
Open Springer listing
In-review systems thinking manuscript for builders, managers, and founders working under real constraints.
Open Amazon listing