arXiv:2212.04088

LLM-Planner: Few-Shot Grounded Planning for Embodied Agents with Large Language Models

Published on Dec 8, 2022
Authors: Chan Hee Song, Jiaman Wu, Clayton Washington, Brian M. Sadler, Wei-Lun Chao, Yu Su

Abstract

This study focuses on using large language models (LLMs) as a planner for embodied agents that can follow natural language instructions to complete complex tasks in a visually perceived environment. The high data cost and poor sample efficiency of existing methods hinder the development of versatile agents that are capable of many tasks and can learn new tasks quickly. In this work, we propose a novel method, LLM-Planner, that harnesses the power of large language models to do few-shot planning for embodied agents. We further propose a simple but effective way to enhance LLMs with physical grounding to generate and update plans that are grounded in the current environment. Experiments on the ALFRED dataset show that our method can achieve very competitive few-shot performance: despite using less than 0.5% of paired training data, LLM-Planner achieves competitive performance with recent baselines that are trained on the full training data. Existing methods can barely complete any task successfully under the same few-shot setting. Our work opens the door to developing versatile and sample-efficient embodied agents that can quickly learn many tasks. Website: https://dki-lab.github.io/LLM-Planner
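At a high level, this style of planner prompts an LLM with a handful of in-context task/plan exemplars plus a list of objects observed so far, and re-prompts for an updated plan when the agent stalls. The sketch below illustrates that prompting loop in Python; it is a minimal sketch, not the paper's implementation. `llm_complete`, `build_prompt`, and the comma-separated "subgoal" plan syntax are all illustrative assumptions standing in for whatever LLM API and prompt format one actually uses.

```python
# Minimal sketch of few-shot grounded planning in the spirit of LLM-Planner.
# `llm_complete` is a hypothetical stand-in for any text-completion API; the
# prompt layout and plan syntax below are assumptions, not the paper's exact format.

def llm_complete(prompt: str) -> str:
    """Placeholder for a real LLM call (e.g., an OpenAI-style completion)."""
    raise NotImplementedError

def build_prompt(instruction, exemplars, visible_objects, completed_steps):
    """Compose a few-shot prompt grounded in the objects seen so far."""
    parts = ["Create a high-level plan for completing a household task."]
    for ex_instruction, ex_plan in exemplars:  # in-context examples
        parts.append(f"Task: {ex_instruction}")
        parts.append("Plan: " + ", ".join(ex_plan))
    parts.append(f"Task: {instruction}")
    # Physical grounding: tell the model what the agent has actually observed.
    parts.append("Visible objects: " + ", ".join(visible_objects))
    if completed_steps:  # context for grounded re-planning
        parts.append("Completed steps: " + ", ".join(completed_steps))
    parts.append("Plan:")
    return "\n".join(parts)

def plan(instruction, exemplars, visible_objects, completed_steps=()):
    """Ask the LLM for the remaining high-level plan as a list of subgoals."""
    response = llm_complete(
        build_prompt(instruction, exemplars,
                     list(visible_objects), list(completed_steps))
    )
    return [step.strip() for step in response.split(",") if step.strip()]
```

Under these assumptions, dynamic grounded re-planning amounts to calling `plan` again with the updated object list and the steps completed so far whenever the low-level controller fails to make progress, so the new plan reflects the current state of the environment.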
