Unraveling the ARC Puzzle: Mimicking Human Solutions with Object-Centric Decision Transformer

Published 14 Jun 2023 in cs.AI and cs.LG (arXiv:2306.08204v1)

Abstract: In the pursuit of artificial general intelligence (AGI), we tackle the Abstraction and Reasoning Corpus (ARC) tasks with a novel two-pronged approach. We employ the Decision Transformer in an imitation learning paradigm to model human problem-solving, and we introduce an object detection algorithm, the Push and Pull clustering method. This dual strategy improves the AI's ARC problem-solving ability and provides insights for progress toward AGI. At the same time, our work reveals the need for better data collection tools, more robust training datasets, and refined model architectures. This study highlights potential improvements for Decision Transformers and motivates future AGI research.
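The abstract does not spell out how the Push and Pull clustering method works. As a hedged illustration of the object-centric idea it serves, the sketch below decomposes an ARC grid into objects using a generic same-color, 4-connected flood fill; this is a stand-in baseline, not the paper's actual algorithm.

```python
from collections import deque

def extract_objects(grid, background=0):
    """Group same-colored, 4-connected cells of an ARC grid into objects.

    Illustrative baseline only: the paper's Push and Pull clustering
    method is not specified in the abstract, so a simple flood fill
    stands in as a generic object-centric decomposition.
    """
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or grid[r][c] == background:
                continue
            color = grid[r][c]
            cells, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                cells.append((y, x))
                # Explore the four orthogonal neighbors of the same color.
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx] and grid[ny][nx] == color):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            objects.append({"color": color, "cells": cells})
    return objects

# Example: a 3x3 grid with one object of color 1 and one of color 2.
grid = [[0, 1, 1],
        [0, 0, 2],
        [2, 2, 2]]
objects = extract_objects(grid)
```

Each detected object (a color plus its cell coordinates) could then be serialized into the state portion of the Decision Transformer's (return, state, action) input sequence.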
