Researchers have developed AIRIS (Autonomous Intelligent Reinforcement Inferred Symbolism), an AI agent capable of navigating and adapting within Minecraft. AIRIS starts with no prior knowledge and learns through a feedback loop: its visual input is a 5×5×5 3D grid of the blocks around it, and its current coordinates provide spatial awareness. It can perform 16 distinct movement actions and begins in "Free Roam" mode, exploring its surroundings, learning from the obstacles it encounters, and gradually filling in an internal map of the environment. When given specific coordinates, AIRIS switches to a targeted navigation mode, which can take it through areas it has never seen before. This combination of free exploration and targeted navigation sets it apart from traditional Reinforcement Learning models. Future enhancements may add actions such as placing blocks, gathering resources, and combat.
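The article does not include source code, so the following is a minimal Python sketch of the perceive-act loop it describes, under stated assumptions: the class and function names (`Observation`, `AirisLikeAgent`, `choose_action`), the map representation, and the action-selection heuristics are all hypothetical stand-ins, since the real AIRIS infers symbolic rules from obstacles rather than acting randomly or greedily.

```python
import random
from dataclasses import dataclass, field
from typing import Optional

GRID_SIZE = 5        # assumed 5x5x5 local view centered on the agent
NUM_ACTIONS = 16     # the 16 movement actions described in the article


@dataclass
class Observation:
    """What the agent perceives each step: a 5x5x5 block grid plus its coordinates."""
    local_grid: list   # nested 5x5x5 list of block identifiers
    position: tuple    # (x, y, z) world coordinates


@dataclass
class AirisLikeAgent:
    """Toy perceive-act loop; not the actual AIRIS implementation."""
    world_map: dict = field(default_factory=dict)  # internal map: (x, y, z) -> block
    mode: str = "free_roam"                        # or "navigate"
    target: Optional[tuple] = None

    def update_map(self, obs: Observation) -> None:
        # Fold the local 5x5x5 view into the persistent internal map,
        # which "Free Roam" mode gradually fills in.
        x0, y0, z0 = obs.position
        for dx in range(GRID_SIZE):
            for dy in range(GRID_SIZE):
                for dz in range(GRID_SIZE):
                    block = obs.local_grid[dx][dy][dz]
                    self.world_map[(x0 + dx - 2, y0 + dy - 2, z0 + dz - 2)] = block

    def choose_action(self, obs: Observation) -> int:
        if self.mode == "navigate" and self.target is not None:
            # Targeted navigation: head toward the given coordinates.
            return self._step_toward(obs.position, self.target)
        # Free roam: explore; a random action stands in for curiosity-driven choice.
        return random.randrange(NUM_ACTIONS)

    def _step_toward(self, pos: tuple, target: tuple) -> int:
        # Placeholder heuristic; the real system plans using its learned rules.
        return hash((pos, target)) % NUM_ACTIONS
```

In this sketch, switching `mode` to `"navigate"` and setting `target` mirrors the described handoff from free exploration to targeted navigation, while `update_map` reflects the goal of filling in the internal map as the agent moves.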