# DMWM

This is the official repository for the paper "DMWM: Dual-Mind World Model with Long-Term Imagination", accepted at NeurIPS 2025.

*Figure: the proposed dual-mind world model.*

## 📝 Installation

You can create and activate the environment as follows:

```shell
conda create -n dmwm python==3.7
conda activate dmwm
pip install -r requirements.txt
```

Suggested GPU: all experiments in the paper were run on a single NVIDIA RTX 3090; an NVIDIA RTX 3080 also works.

Training environment: Google DeepMind's infrastructure for physics-based simulation (the DeepMind Control Suite).

## 🚀 Training

To train the model(s) in the paper, run the following command, taking the walker-walk task as an example:

```shell
python main.py --algo dreamer --env walker-walk --action-repeat 2 --id your_named_experiment
```

Some useful variants:

```shell
python main.py --algo dreamer --env walker-walk --action-repeat 2 --logic-overshooting-distance 10 --id your_named_experiment
python main.py --algo dreamer --env walker-walk --action-repeat 2 --planning-horizon 50 --id your_named_experiment
python main.py --algo dreamer --env walker-walk --action-repeat 2 --planning-horizon 50 --logic-overshooting-distance 50 --id your_named_experiment
```
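If you want to sweep over several settings, a small launcher script can assemble the commands above. This is a hypothetical helper, not part of the repository; only the flag names come from the commands shown here, and the sweep values are illustrative:

```python
# Hypothetical sweep launcher for DMWM training runs. The flag names mirror
# the README commands; the horizon values below are illustrative only.
import subprocess  # used by the (commented-out) launch line


def build_command(env="walker-walk", action_repeat=2, planning_horizon=None,
                  logic_overshooting_distance=None, run_id="sweep"):
    """Assemble a main.py invocation using the flag names from the README."""
    cmd = ["python", "main.py", "--algo", "dreamer",
           "--env", env, "--action-repeat", str(action_repeat)]
    if planning_horizon is not None:
        cmd += ["--planning-horizon", str(planning_horizon)]
    if logic_overshooting_distance is not None:
        cmd += ["--logic-overshooting-distance", str(logic_overshooting_distance)]
    cmd += ["--id", run_id]
    return cmd


if __name__ == "__main__":
    for horizon in (10, 30, 50):  # illustrative horizon sweep
        cmd = build_command(planning_horizon=horizon, run_id=f"horizon_{horizon}")
        print(" ".join(cmd))
        # subprocess.run(cmd, check=True)  # uncomment to actually launch the runs
```

The script only prints the commands by default; uncomment the `subprocess.run` line to launch them sequentially.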

## 🌈 Evaluation

To evaluate the trained model on control tasks, run:

```shell
python main.py --models saved_path --test
```
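To evaluate several checkpoints in one go, you could wrap the command above in a short script. This is a hypothetical sketch: the `--models`/`--test` flags come from the README, but the checkpoint file pattern is an assumption, not something documented by the repository:

```python
# Hypothetical batch-evaluation wrapper. The --models/--test flags are from
# the README; the checkpoint glob pattern is an assumed example layout.
import glob
import subprocess  # used by the (commented-out) launch line


def evaluation_commands(pattern):
    """Build one `main.py --models <path> --test` command per checkpoint
    matching the glob `pattern`, in sorted order."""
    return [["python", "main.py", "--models", path, "--test"]
            for path in sorted(glob.glob(pattern))]


if __name__ == "__main__":
    # Adjust the pattern to wherever your training runs save models.
    for cmd in evaluation_commands("results/*/models_*.pth"):
        print(" ".join(cmd))
        # subprocess.run(cmd, check=True)  # uncomment to run the evaluations
```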

## ❤️ Acknowledgement

Our implementation builds on Dreamer (for System 1) and on the Logic-Integrated Neural Network (LINN), which serves as the basic framework for System 2 together with the proposed deep logical inference and automatic logic learning from environment dynamics. Thanks for their great open-source work!

## 📄 License

All content in this repository is under the MIT license.

## ⭐ Citation

If any part of our paper or code helps your research, please consider citing us and giving our repository a star.

```bibtex
@article{wang2025dmwm,
  title={DMWM: Dual-Mind World Model with Long-Term Imagination},
  author={Wang, Lingyi and Shelim, Rashed and Saad, Walid and Ramakrishnan, Naren},
  journal={arXiv preprint arXiv:2502.07591},
  year={2025}
}
```

## 👍 Some Test Results

High data efficiency and robust planning over extended horizon sizes:

*Figures: "Data Efficiency" and "Robust Planning Over Extended Horizon Size".*
