PCEG: Prior-Constrained Explorative Guidance for Generalization in Diffusion Motion Planning
Abstract
Diffusion-based planners have improved generalization via guidance, which integrates classical planning principles into learning-based methods. By leveraging inference-time optimization, these methods have achieved generalization comparable to classical planners. However, their limited ability to capture environmental variations often restricts their responsiveness in unfamiliar settings. In addition, the diversity-consistency trade-off observed for guidance in image domains remains unresolved. In this work, we propose Prior-Constrained Explorative Guidance (PCEG), a novel approach that gathers environmental information through local exploration and leverages a trajectory prior to prevent guided samples from converging prematurely to similar solutions. The collected information is incorporated into the guidance via stochastic gradient estimation, while a succinct parameter-scheduling strategy enables latent optimization driven by environmental signals without significant computational overhead. Furthermore, during the mode-seeking stages of the reverse diffusion process, we employ a Gaussian Process (GP) to enforce dynamics-informed priors, effectively constraining the exploration region of each sample and thereby enhancing solution diversity. Across diverse benchmarks, including 7-degree-of-freedom (7-DoF) robot-arm manipulation, PCEG substantially improves success rates by up to 30 percentage points over competitive diffusion planners without compromising trajectory quality, even in scenarios involving unseen obstacles. Real-world experiments further validate these findings, showcasing the generation of smooth, collision-free trajectories in novel environments.
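To give intuition for the two ingredients the abstract names, the sketch below combines (1) a zeroth-order (perturbation-based) stochastic gradient estimate of an obstacle cost, and (2) a pull toward a smooth trajectory prior, here simplified to a straight-line mean between start and goal. This is an illustrative 2D toy, not the paper's algorithm: all function names, the circular-obstacle cost, the prior mean, and the step sizes are our assumptions, and the actual method operates inside the reverse diffusion process rather than on a raw trajectory.

```python
import numpy as np

def collision_cost(traj, obstacles, radius=0.3):
    """Sum of penetration depths of waypoints into circular obstacles (toy cost)."""
    cost = 0.0
    for center in obstacles:
        dist = np.linalg.norm(traj - center, axis=1)
        cost += np.clip(radius - dist, 0.0, None).sum()
    return cost

def stochastic_gradient(cost_fn, traj, sigma=0.05, n_samples=64, rng=None):
    """Zeroth-order gradient estimate: average cost-difference-weighted perturbations."""
    rng = np.random.default_rng(rng)
    base = cost_fn(traj)
    grad = np.zeros_like(traj)
    for _ in range(n_samples):
        eps = rng.normal(scale=sigma, size=traj.shape)
        grad += (cost_fn(traj + eps) - base) * eps / sigma**2
    return grad / n_samples

def gp_prior_pull(traj, start, goal, strength=0.1):
    """Pull toward a straight-line prior mean between start and goal
    (a crude stand-in for a dynamics-informed GP trajectory prior)."""
    mean = np.linspace(start, goal, len(traj))
    return strength * (mean - traj)

def guided_step(traj, obstacles, start, goal, lr=0.2, rng=None):
    """One guidance update: descend the estimated collision gradient
    while staying anchored to the trajectory prior."""
    grad = stochastic_gradient(lambda x: collision_cost(x, obstacles), traj, rng=rng)
    return traj - lr * grad + gp_prior_pull(traj, start, goal)
```

In this toy, the prior term keeps each sample tethered to its own smooth mean, so separate samples explore locally around distinct solutions instead of collapsing onto one; the paper's GP prior plays an analogous role within the reverse diffusion steps.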
Proposed Framework
Qualitative Results
Simple2D
Narrow2D
Dense2D
Sphere3D with Franka Panda
Qualitative Results : Computationally Light Version
Simple2D
Narrow2D
Dense2D
Sphere3D with Franka Panda
First-Seen Env
Cube3D case1
Cube3D case2
Cube3D with a Big Grasped Object case1
Cube3D with a Big Grasped Object case2
Cube3D with a Grasped Object
First-Seen Env : Computationally Light Version
Cube3D case1
Cube3D case2
Cube3D with a Big Grasped Object case1
Cube3D with a Big Grasped Object case2
Cube3D with a Grasped Object
First-Seen Env : Real World
PCEG case1
PCEG-s-DDIM case1
PCEG case2
PCEG-s-DDIM case2
PCEG case3
PCEG-s-DDIM case3
PCEG case3 in sparse obstacles
PCEG-s-DDIM case3 in sparse obstacles
BibTeX
@inproceedings{Kim2026PriorConstrained,
  title={Prior-Constrained Explorative Guidance for Generalization in Diffusion Motion Planning},
  author={Kim, Sunhwi and Kim, Junsu and Baek, Seungjae and Shin, Jaechan and Lee, Jungeun and Lee, Seongjae and Joo, Kyungdon and Jeon, Jeong hwan},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2026},
  note={Accepted}
}