Play Any Game with Body Gestures uses full-body motion tracking to let players control games through physical movements, creating an immersive and controller-free gaming experience. It leverages AR frameworks and pose estimation models to map user body movements—like jumping, crouching, or leaning—into real-time game commands. This innovation blends physical activity with digital gameplay, enhancing user engagement, accessibility, and fun across various gaming platforms.
This project aims to change how people interact with digital games by replacing traditional input devices with body-based gesture controls. It uses computer vision and depth sensing to track full-body motion and translate physical actions into in-game interactions. The system suits fitness games, educational simulations, and immersive entertainment experiences where active play is encouraged, offering intuitive, controller-free interactivity in both AR and traditional gaming environments.
Detects and analyzes body posture, joint positions, and movement patterns in real time, letting players interact with gameplay environments through natural body gestures.
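As a minimal sketch of how joint positions can drive posture detection: the snippet below classifies a frame as a jump, crouch, or neutral stance by comparing hip height against a calibrated standing baseline. The keypoint names, coordinate convention (normalized, y increasing downward), and thresholds are illustrative assumptions; a real system would read per-frame joint positions from a pose-estimation model.

```python
# Hypothetical keypoints: name -> (x, y) normalized to [0, 1], y grows downward.
def classify_posture(keypoints, baseline_hip_y, jump_thresh=0.08, crouch_thresh=0.08):
    """Classify a frame as 'jump', 'crouch', or 'stand' by comparing the
    current hip height against a calibrated standing baseline."""
    hip_y = (keypoints["left_hip"][1] + keypoints["right_hip"][1]) / 2
    if baseline_hip_y - hip_y > jump_thresh:    # hips moved up -> jump
        return "jump"
    if hip_y - baseline_hip_y > crouch_thresh:  # hips moved down -> crouch
        return "crouch"
    return "stand"

# Example frame: hips well below the standing baseline -> crouch.
frame = {"left_hip": (0.45, 0.72), "right_hip": (0.55, 0.74)}
print(classify_posture(frame, baseline_hip_y=0.60))  # crouch
```

Calibrating the baseline from a few seconds of standing footage, rather than hard-coding it, keeps the classifier robust to different player heights and camera angles.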
Uses AI models to classify physical gestures such as walking, jumping, and pointing, and maps them to game controls. Supports customizable action mapping for diverse game genres.
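Customizable action mapping could look like the sketch below: a default gesture-to-command table that each game can overlay with its own bindings. The gesture labels and command names are hypothetical; in practice the labels would come from the gesture classifier and the commands from the target game's input API.

```python
# Default bindings; both gesture labels and command names are illustrative.
DEFAULT_BINDINGS = {"jump": "BUTTON_A", "crouch": "BUTTON_B", "point": "AIM"}

class GestureMapper:
    def __init__(self, bindings=None):
        # Start from the defaults, then overlay any per-game customization.
        self.bindings = dict(DEFAULT_BINDINGS)
        if bindings:
            self.bindings.update(bindings)

    def to_command(self, gesture):
        # Unmapped gestures return None rather than raising, so stray
        # classifier outputs don't interrupt the game loop.
        return self.bindings.get(gesture)

# Remap 'point' for a hypothetical platformer that uses it to dash.
mapper = GestureMapper({"point": "DASH"})
print(mapper.to_command("jump"))   # BUTTON_A
print(mapper.to_command("point"))  # DASH
```

Keeping the mapping in a plain dictionary makes it trivial to load per-genre profiles from a config file.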
Removes the need for handheld controllers, making gameplay more immersive and accessible. Ideal for fitness, dance, sports, and interactive AR experiences.
Provides augmented overlays or visual guides during gameplay to help users perform gestures correctly. Improves learning and responsiveness with on-screen feedback loops.
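One way such a feedback loop might work, sketched under assumed joint names and target values: measure the angle at a joint, compare it to a target pose, and emit a corrective hint for the on-screen overlay. The target angle and tolerance here are illustrative, not values from the project.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (in degrees) formed by segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def feedback(measured, target=90.0, tolerance=15.0):
    # Hypothetical hint strings the overlay would render near the player.
    if abs(measured - target) <= tolerance:
        return "good"
    return "raise arm" if measured < target else "lower arm"

# Shoulder, elbow, wrist forming a right angle at the elbow.
angle = joint_angle((0.0, 0.0), (0.0, 1.0), (1.0, 1.0))
print(round(angle), feedback(angle))  # 90 good
```

Running this per frame closes the loop: the player adjusts, the measured angle converges on the target, and the hint switches to confirmation.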
Compatible with web-based, desktop, and AR headset platforms using standard APIs and SDKs. Easily integrates with engines like Unity or Unreal for cross-platform deployment.
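An engine-agnostic integration can be sketched as a thin event bridge: gesture events are serialized as JSON messages that an engine-side listener (for example, a Unity or Unreal plugin reading from a local socket) could consume. The message schema below is an assumption for illustration, not a standard or the project's actual protocol.

```python
import json
import time

def encode_event(gesture, confidence, timestamp=None):
    """Serialize a gesture event into a JSON message for the engine side.
    Field names ('type', 'name', 'confidence', 'ts') are hypothetical."""
    return json.dumps({
        "type": "gesture",
        "name": gesture,
        "confidence": round(confidence, 3),
        "ts": timestamp if timestamp is not None else time.time(),
    })

msg = encode_event("jump", 0.927, timestamp=12.5)
# Engine side: decode the message and dispatch to the input system.
event = json.loads(msg)
print(event["name"], event["confidence"])  # jump 0.927
```

Using plain JSON over a local transport keeps the tracking process decoupled from the engine, so the same gesture pipeline can serve web, desktop, and headset builds.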