GitHub: noahpro99/3Dera · Devpost: 3Dera on Devpost · Demo: YouTube walkthrough

3Dera project preview

A hackathon project that makes history learning immersive: step into AI-generated VR scenes while listening to narrated context about the event. The 3D scene is generated on the fly from the selected historical topic.

[vr, ai, education, 3d]

Why it stands out

  • Won overall at HooHacks 2024 (as noted in the project README).
  • Generates scenes live from prompts and historical context instead of relying on pre-made, manually built levels.
  • Combines 3D scene generation with contextual narration for learning.

How it works

  1. Event understanding: uses GPT-4 to interpret a historical event and decide scene context.
  2. Scene construction: pulls objects from Objaverse and places/scales them in Blender with scripted coordinates.
  3. Environment generation: uses Hugging Face APIs for skybox and ground texture generation.
  4. Learning layer: uses Wikipedia + GPT-4 summarization to generate audio narration for the scene.
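Steps 1 and 2 can be sketched in pure Python. The layout schema, `EXAMPLE_LAYOUT`, and `to_blender_placements` below are illustrative assumptions, not the project's actual code: the idea is that GPT-4 is asked to return structured JSON (an Objaverse search term, position, and scale per object), which a Blender script then consumes.

```python
import json

# Hypothetical example of the structured layout GPT-4 might be prompted
# to return for an event (step 1). The exact schema is an assumption.
EXAMPLE_LAYOUT = json.loads("""
{
  "event": "Signing of the Magna Carta",
  "objects": [
    {"query": "medieval tent", "position": [0, 0, 0], "scale": 2.0},
    {"query": "wooden table", "position": [2, 1, 0], "scale": 1.0}
  ]
}
""")

def to_blender_placements(layout):
    """Convert the layout dict into (query, location, scale) tuples.

    A Blender script (step 2) would fetch each Objaverse asset by its
    search query, then apply the transform via bpy, roughly:
        obj.location = location
        obj.scale = scale
    """
    placements = []
    for entry in layout["objects"]:
        x, y, z = entry["position"]
        s = entry["scale"]
        placements.append((entry["query"], (x, y, z), (s, s, s)))
    return placements

print(to_blender_placements(EXAMPLE_LAYOUT))
```

Keeping the LLM output as strict JSON makes the Blender step deterministic: the model decides *what* goes in the scene, while scripted code handles placement.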

Technical stack

  • Frontend: React/TypeScript
  • Backend: Python + FastAPI
  • 3D pipeline: Blender scripting + Objaverse assets
  • AI services: OpenAI, Hugging Face, Wikipedia API
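The learning layer (step 4 above) can be sketched as two small helpers. The function names and the narration prompt are hypothetical; the Wikipedia REST summary endpoint is a real public API, but the exact endpoint the project used is an assumption.

```python
from urllib.parse import quote

def wikipedia_summary_url(title: str) -> str:
    # Wikipedia's REST API summary endpoint; a backend would GET this
    # URL to fetch a plain-text extract for the chosen event.
    return f"https://en.wikipedia.org/api/rest_v1/page/summary/{quote(title)}"

def narration_prompt(event: str, extract: str) -> str:
    # Hypothetical prompt handed to GPT-4 to turn the extract into
    # narration text, which is then synthesized to audio for the scene.
    return (
        f"Rewrite the following background on '{event}' as a short, "
        f"engaging museum-style narration:\n\n{extract}"
    )

print(wikipedia_summary_url("Magna Carta"))
```

Splitting retrieval (Wikipedia) from rewriting (GPT-4) keeps the narration grounded in a citable source rather than generated from scratch.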