LLM-Planner: Few-Shot Grounded Planning for Embodied Agents with Large Language Models

March 18, 2025 · View on GitHub

Code for LLM-Planner.

Check project website for an overview and a demo.

What's Here

  • hlp/: A high-level prompt generator and kNN dataset from our paper. Just bring your low-level controller (and an LLM)!
  • e2e/: An end-to-end agent that uses an LLM-Planner.
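To make the idea behind the high-level prompt generator concrete, here is a minimal sketch of kNN-based few-shot prompting: retrieve the k training examples most similar to the current instruction and splice them into the prompt ahead of the query. The function names, data layout, and the bag-of-words cosine metric are illustrative assumptions, not the repository's actual API (the paper uses learned embeddings for retrieval).

```python
# Hypothetical sketch of kNN example retrieval for a few-shot planner
# prompt. Everything here (names, data format, similarity metric) is an
# assumption for illustration; see hlp/ for the real implementation.
from collections import Counter
import math


def cosine_sim(a: str, b: str) -> float:
    """Bag-of-words cosine similarity between two instructions."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0


def knn_examples(instruction: str, dataset: list[dict], k: int = 9) -> list[dict]:
    """Return the k dataset entries whose instructions are closest to the query."""
    ranked = sorted(
        dataset,
        key=lambda ex: cosine_sim(instruction, ex["instruction"]),
        reverse=True,
    )
    return ranked[:k]


def build_prompt(instruction: str, dataset: list[dict], k: int = 9) -> str:
    """Assemble a few-shot prompt: k retrieved (task, plan) pairs, then the query."""
    parts = [
        f"Task: {ex['instruction']}\nPlan: {ex['plan']}"
        for ex in knn_examples(instruction, dataset, k)
    ]
    parts.append(f"Task: {instruction}\nPlan:")
    return "\n\n".join(parts)
```

The key design point is that retrieval happens per query, so the in-context examples adapt to each new instruction instead of being a fixed hand-picked set.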

Quickstart

Check out the hlp/ and e2e/ READMEs for more details.

Implementation Examples

We provide examples of how the community has been using our work. We appreciate everyone's interest!

Acknowledgements

We thank OSUNLP for providing valuable feedback and suggestions.

License

  • LLM-Planner - MIT License

Contact

Questions or issues? File an issue or contact Luke Song.

Citation Information

@InProceedings{song2023llmplanner,
  author    = {Song, Chan Hee and Wu, Jiaman and Washington, Clayton and Sadler, Brian M. and Chao, Wei-Lun and Su, Yu},
  title     = {LLM-Planner: Few-Shot Grounded Planning for Embodied Agents with Large Language Models},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2023},
}