Autonomous Learning

AppWizard
November 11, 2024
Researchers have developed AIRIS (Autonomous Intelligent Reinforcement Inferred Symbolism), an AI agent capable of navigating and adapting within Minecraft. AIRIS starts with no prior knowledge and learns through a feedback loop, using a 5×5×5 grid of surrounding blocks for visual input and its current coordinates for spatial awareness. It can perform 16 distinct movement actions and begins in "Free Roam" mode, exploring and mapping its environment; the agent learns from the obstacles it encounters and works to fill in its internal map. Future enhancements may include actions such as placing blocks, gathering resources, and combat. When given specific coordinates, AIRIS switches to a targeted navigation mode, which can lead it through previously unseen areas. This combination of free exploration and targeted navigation sets it apart from traditional reinforcement learning models.
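The loop described above can be sketched roughly as follows. This is an illustrative reconstruction, not code from AIRIS itself: the function names, the use of six axis-aligned moves as a stand-in for the full set of 16 actions, and the greedy distance heuristic for targeted mode are all assumptions for the sake of the example.

```python
import random

GRID_RADIUS = 2    # a 5x5x5 perception grid means radius 2 around the agent
# Six axis-aligned moves stand in for a subset of the 16 actions; the full
# action set is not specified in the article.
MOVES = {0: (1, 0, 0), 1: (-1, 0, 0), 2: (0, 1, 0),
         3: (0, -1, 0), 4: (0, 0, 1), 5: (0, 0, -1)}

def neighborhood(pos):
    """All 125 cells of the 5x5x5 grid centered on the agent."""
    x, y, z = pos
    return [(x + dx, y + dy, z + dz)
            for dx in range(-GRID_RADIUS, GRID_RADIUS + 1)
            for dy in range(-GRID_RADIUS, GRID_RADIUS + 1)
            for dz in range(-GRID_RADIUS, GRID_RADIUS + 1)]

def observe(internal_map, world, pos):
    """Fill the internal map with whatever the current grid shows."""
    for cell in neighborhood(pos):
        internal_map[cell] = world.get(cell, "air")

def choose_action(pos, target=None):
    """Free Roam when no target is set; otherwise targeted navigation."""
    if target is None:
        # Free Roam: a real system would bias toward unmapped cells.
        return random.choice(list(MOVES))
    # Targeted mode: greedily pick the move that most reduces the
    # Manhattan distance to the requested coordinates.
    def dist(p):
        return sum(abs(a - b) for a, b in zip(p, target))
    return min(MOVES, key=lambda a: dist(tuple(p + d
                                               for p, d in zip(pos, MOVES[a]))))
```

In this sketch, observing after every step is what "fills in" the internal map, and switching between the two modes is just a matter of whether a target coordinate has been supplied.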
AppWizard
November 7, 2024
The ASI Alliance has introduced AIRIS, an AI system that learns autonomously within Minecraft, representing a significant step toward artificial general intelligence (AGI). AIRIS, which stands for Autonomous Intelligent Reinforcement Inferred Symbolism, is designed to adapt independently in a virtual environment, showcasing dynamic learning capabilities. Developed in collaboration with partners such as SingularityNET, Fetch.ai, and Ocean Protocol, AIRIS relies on a robust infrastructure to store and process information efficiently. It adapts to its environment in real time, navigating complex terrain and developing its own rules from observation. AIRIS also aims to cultivate advanced interaction skills, including construction and object creation, potentially enabling collaborative tasks with other agents. The ultimate goal is to apply AIRIS in real-world scenarios, such as autonomous robots and intelligent assistants, across sectors that demand adaptive learning and problem-solving.
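"Developing its own rules from observation" can be illustrated with a minimal sketch: keep a table of (state, action) → predicted outcome, and revise any rule whose prediction fails. The class and method names below are hypothetical, chosen only to make the idea concrete; they are not from the AIRIS codebase.

```python
class RuleLearner:
    """Toy symbolic rule learner: remember outcomes, revise on surprise."""

    def __init__(self):
        # (state, action) -> the outcome last observed for that pair
        self.rules = {}

    def predict(self, state, action):
        """Return the expected outcome, or None if no rule exists yet."""
        return self.rules.get((state, action))

    def learn(self, state, action, outcome):
        """Record or correct the rule; return True if a prediction failed."""
        predicted = self.rules.get((state, action))
        surprised = predicted is not None and predicted != outcome
        self.rules[(state, action)] = outcome
        return surprised
```

A failed prediction (the `surprised` flag) is the signal that the agent's model of the world needs updating, which is the essence of the observation-driven rule building described above.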