Agent Context Optimization (ACON): Optimizing Context Compression for Long-horizon LLM Agents

October 14, 2025


Python 3.11+ · MIT License

acon is a research framework for optimizing context compression for long-horizon LLM agents. It focuses on minimizing redundant memory growth while preserving the information essential for decision-making.

It provides standardized pipelines for environments, agents, context compression, and distillation (compressor and agent) across multiple realistic benchmarks such as AppWorld, OfficeBench, and 8-objective QA.

This repository contains the official implementation of the paper:

ACON: Optimizing Context Compression for Long-horizon LLM Agents
Minki Kang, Wei-Ning Chen, Dongge Han, Huseyin A. Inan, Lukas Wutschitz, Yanzhi Chen, Robert Sim, Saravan Rajmohan

If you find our work useful, please cite it:

@misc{kang2025aconoptimizingcontextcompression,
      title={ACON: Optimizing Context Compression for Long-horizon LLM Agents}, 
      author={Minki Kang and Wei-Ning Chen and Dongge Han and Huseyin A. Inan and Lukas Wutschitz and Yanzhi Chen and Robert Sim and Saravan Rajmohan},
      year={2025},
      eprint={2510.00615},
      archivePrefix={arXiv},
      primaryClass={cs.AI},
      url={https://arxiv.org/abs/2510.00615}, 
}


🚀 Quickstart (AppWorld)

Install the AppWorld environment (details in the AppWorld README).

git lfs install
git clone https://github.com/StonyBrookNLP/appworld
cd appworld
pip install -e .
appworld install --repo
appworld download data

Run the AppWorld agent with history compression:

git clone https://github.com/microsoft/acon.git
mv /path/to/appworld/data /path/to/acon/experiments/appworld
cd acon
pip install -e .
# Place the OpenAI API key in `configs/private_config.yaml`.

cd experiments/appworld
python run_all.py \
    --split train \
    --model_name gpt-4.1-mini \
    --tag baseline \
    --co_config_path configs/context_opt/gpt-4.1-mini_history.yaml

Results will be saved in:

experiments/appworld/outputs/gpt-4.1-mini_baseline/
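To inspect what a run produced, a small helper like the following can gather the result files under that output directory. This is a sketch under assumptions: the exact file layout and names inside `outputs/` are not specified here, so the glob pattern is a placeholder to adjust.

```python
from pathlib import Path

def collect_result_files(output_dir, pattern="*.json"):
    """Recursively collect result files under an experiment output directory.

    The layout under `outputs/` is an assumption; change `pattern` to match
    the files your run actually produces.
    """
    root = Path(output_dir)
    if not root.exists():
        return []
    return sorted(root.rglob(pattern))

# Example (hypothetical layout):
# files = collect_result_files("experiments/appworld/outputs/gpt-4.1-mini_baseline")
```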

Want to optimize the compression guideline and distill it to a local model? See below!

๐Ÿ› ๏ธ Installation

Prerequisites

  • Python 3.11+

Basic Installation

git clone https://github.com/microsoft/acon.git
cd acon
pip install -e .

Configuration

Place your OpenAI API key in configs/private_config.yaml:

openai_key: "your_api_key_here"
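In your own scripts, the key can be read back from this file. The snippet below is a minimal stdlib-only sketch that parses just this one-key file; in practice a proper YAML parser (e.g. PyYAML's `safe_load`) is the more robust choice, and how the repo itself loads the config is not shown here.

```python
from pathlib import Path

def read_openai_key(config_path="configs/private_config.yaml"):
    """Read the `openai_key` value from the private config file.

    Minimal line-based parse, sufficient for a single-key YAML file.
    """
    for line in Path(config_path).read_text().splitlines():
        if line.strip().startswith("openai_key:"):
            value = line.split(":", 1)[1].strip()
            return value.strip('"').strip("'")
    raise KeyError(f"openai_key not found in {config_path}")
```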

📚 Repository Structure

acon/
├── configs/                # API config
├── experiments/            # benchmark runners (AppWorld, OfficeBench, QA) & utils for fine-tuning and prompt optimization
├── src/productive_agents/  # implementations of environments, agents, and context compressors
└── README.md

📊 Benchmarks

We currently support three benchmark families:

| Benchmark | Description | Folder |
| --- | --- | --- |
| AppWorld | Day-to-day personal task workflows | `experiments/appworld` |
| OfficeBench | Office productivity automation | `experiments/officebench` |
| 8-objective QA | Lightweight reasoning & retrieval tasks | `experiments/smolagents` |

All benchmarks follow the same experimental pipeline:

  1. Run baseline experiments with GPT models
  2. Optimize context compression guidelines
  3. Distillation Stage 1 (Compressor LoRA)
  4. Distillation Stage 2 (Agent LoRA)
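The four stages above can be sketched as a dry-run command builder. Only `run_all.py` and its flags come from the Quickstart; the later stage scripts and their names here are hypothetical placeholders for whatever each benchmark's instructions actually specify.

```python
def build_stage_commands(model_name, tag, co_config):
    """Assemble the four pipeline stages as command lines (dry run).

    Stage 1 mirrors the Quickstart invocation; stages 2-4 use
    hypothetical script names standing in for the real per-benchmark ones.
    """
    baseline = [
        "python", "run_all.py",
        "--split", "train",
        "--model_name", model_name,
        "--tag", tag,
        "--co_config_path", co_config,
    ]
    return {
        "1_baseline": baseline,
        "2_optimize_guideline": ["python", "optimize_guideline.py"],     # hypothetical
        "3_distill_compressor": ["python", "train_compressor_lora.py"],  # hypothetical
        "4_distill_agent": ["python", "train_agent_lora.py"],            # hypothetical
    }

# Example:
# stages = build_stage_commands("gpt-4.1-mini", "baseline",
#                               "configs/context_opt/gpt-4.1-mini_history.yaml")
```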

For detailed per-benchmark instructions, please refer to the README in each benchmark folder.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit Contributor License Agreements.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.