MPCViz is a lightweight RViz2 front-end for inspecting receding-horizon control (RHC) plans and measured robot states in real time. The library focuses solely on visualization: you decide how to produce and publish data, MPCViz subscribes to a well-defined set of topics, and RViz renders the planner horizon, contacts, and reference trajectories.
- ROS 2 (tested on Humble and Jazzy; both ship RViz2)
- PyYAML
- NumPy
- urdf_parser_py
- Handshake – `mpc_viz.utils.handshake.MPCVizHandshake` announces how many shooting nodes your optimizer exposes. MPCViz waits for this before spawning TF frames.
- Topic contract – topic names are generated via `mpc_viz.utils.namings.NamingConventions`. Required streams (all ROS 2 `Float64MultiArray` or `String`) are:
  - `/MPCViz_<ns>_HandShake`: contains `[n_nodes]`.
  - `/MPCViz_<ns>_robot_actuated_jointnames` / `/MPCViz_<ns>_rhc_actuated_jointnames`: semicolon-separated joint names encoded with `mpc_viz.utils.string_list_encoding.StringArray`.
  - `/MPCViz_<ns>_robot_q`: `[root_pose(7); joint_positions]` for the measured robot.
  - `/MPCViz_<ns>_rhc_q`: stacked horizon states, where each node repeats the same layout as the robot state.
- Optional – `/MPCViz_<ns>_rhc_refs`, `/MPCViz_<ns>_hl_refs`, `/MPCViz_<ns>_rhc_contacts`, `/clock`, etc. add overlays such as reference twists, high-level goals, and contact wrenches.
- RViz client – `mpc_viz.MPCViz` loads your URDF, starts RViz2 with a template configuration (see `mpc_viz/cfg`), and publishes the TF tree and markers for every horizon node. The `nodes_perc` argument lets you down-sample long horizons to keep the scene readable.
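To make the state layout concrete, the sketch below builds one measured-state vector and a stacked horizon with plain NumPy. The dimensions, values, and the column-major flattening order are illustrative assumptions, not part of the documented contract:

```python
import numpy as np

n_nodes = 10    # number of shooting nodes (illustrative)
n_joints = 12   # number of actuated joints (illustrative)

# Per-node layout: [x, y, z, qx, qy, qz, qw] followed by joint positions.
root_pose = np.array([0.0, 0.0, 0.6, 0.0, 0.0, 0.0, 1.0])
joint_pos = np.zeros(n_joints)
node_state = np.concatenate([root_pose, joint_pos])   # shape: (7 + n_joints,)

# /MPCViz_<ns>_robot_q carries exactly one such vector.
robot_q = node_state

# /MPCViz_<ns>_rhc_q stacks one vector per node ([n_state x n_nodes] matrix),
# flattened for Float64MultiArray. Column-major order (node-by-node) is an
# assumption here; check MPCViz for the expected ordering.
horizon = np.tile(node_state[:, None], (1, n_nodes))
rhc_q = horizon.ravel(order="F")                      # shape: (n_nodes * (7 + n_joints),)
```

With this layout, entry 0 of each node block is the base `x` position and entries 7 onward are the joint positions for that node.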
Because the interface is ROS-topic based, integrating your own controller only requires a bridge that maps its internal data structures into the topics above.
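Such a bridge can reduce to a single function that maps the controller's internal data onto (topic name, payload) pairs, which an rclpy node then publishes. Only the topic names below follow the contract; the input dictionary schema, the function name, and the flattening order are illustrative assumptions:

```python
import numpy as np

def bridge_payloads(ns: str, ctrl_state: dict) -> dict:
    """Map a controller's internal data (hypothetical schema) onto the
    MPCViz topic contract. Values are flat float lists, ready to be copied
    into Float64MultiArray.data by a publisher."""
    q_robot = np.asarray(ctrl_state["measured_q"])    # [7 + n_joints]
    q_horizon = np.asarray(ctrl_state["horizon_q"])   # [n_state x n_nodes]
    return {
        f"/MPCViz_{ns}_robot_q": q_robot.ravel().tolist(),
        # Column-major flattening (node-by-node) is assumed here.
        f"/MPCViz_{ns}_rhc_q": q_horizon.ravel(order="F").tolist(),
        f"/MPCViz_{ns}_HandShake": [float(q_horizon.shape[1])],
    }

payloads = bridge_payloads("demo", {
    "measured_q": np.zeros(19),
    "horizon_q": np.zeros((19, 10)),
})
```

Keeping the mapping in one pure function like this makes the bridge easy to unit-test without a running ROS graph.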
The mpc_viz/tests folder contains fully working dummy publishers implemented with rclpy. They publish synthetic data for two publicly available robot descriptions and have been verified on ROS 2 Humble and Jazzy:
Steps with Centauro (for Aliengo, use `--robot_type aliengo` and the corresponding description path: `$PATH_TO_ALIENGO_REPO/unitree_ros/robots/aliengo_description/aliengo_urdf`):
```bash
# Terminal 1 – start the dummy publishers (server) for state, nodes and references
source /opt/ros/humble/setup.bash
python3 mpc_viz/tests/dummy_publisher_all.py \
    --robot_type centauro \
    --n_rhc_nodes 10
```

```bash
# Terminal 2 – launch MPCViz (client, generates a URDF automatically)
source /opt/ros/humble/setup.bash
python3 mpc_viz/tests/test_rhcviz.py \
    --dpath "$PATH_TO_CENTAURO_REPO/iit-centauro-ros-pkg/centauro_urdf" \
    --robot_type centauro
```
`test_rhcviz.py` builds a URDF via `RoboUrdfGen`, then runs MPCViz with relaxed joint-name checking so that it accepts the dummy streams. `dummy_publisher_all.py` spawns four scripts (`dummy_handshaker.py`, `dummy_rhc_state_publisher.py`, `dummy_robot_state_publisher.py`, `dummy_rhc_ref_state_publisher.py`) that follow the exact topic contract, making them ideal templates for your own bridge. In your own server, all publishers can simply live in a single process/node.
- Advertise handshake data early: instantiate `MPCVizHandshake(..., is_server=True)` and periodically call `set_n_nodes(n)` so the client knows how many nodes to expect.
- Publish joint names once: send the URDF joint ordering using `StringArray.encode`. This lets MPCViz match vector entries to robot joints.
- Stream horizon states: flatten an `[n_state x n_nodes]` matrix into a `Float64MultiArray`. The base pose should be `[x, y, z, qx, qy, qz, qw]` per node, followed by commanded joint positions.
- Stream measured robot state: same format, but only one node.
- Optional overlays: publish references (`_rhc_refs`, `_hl_refs`), contact wrenches (`_rhc_contacts`), or `/clock` for bagging/synchronization.
- Tune visualization: pass `nodes_perc` to MPCViz to sub-sample horizons, or disable joint-name checking via `check_jnt_names=False` if your bridge guarantees correct ordering.
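The contract above states that joint names travel as a semicolon-separated string. A standalone stand-in for that encoding is sketched below; it mimics the behavior attributed to `mpc_viz.utils.string_list_encoding.StringArray`, whose actual implementation may differ:

```python
def encode_joint_names(names: list) -> str:
    """Semicolon-separated encoding, per the topic contract.
    Stand-in for StringArray.encode (behavior assumed from the
    contract, not taken from the actual implementation)."""
    for n in names:
        if ";" in n:
            raise ValueError(f"joint name {n!r} contains the separator ';'")
    return ";".join(names)

def decode_joint_names(payload: str) -> list:
    """Inverse mapping; an empty payload yields no joint names."""
    return payload.split(";") if payload else []

msg = encode_joint_names(["hip_yaw_1", "hip_pitch_1", "knee_pitch_1"])
```

Publishing this string once on `/MPCViz_<ns>_robot_actuated_jointnames` (and its `rhc` counterpart) is enough for MPCViz to align vector entries with URDF joints.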
Looking for a full-stack project that already integrates MPCViz? Check out IBRIDO, which combines reinforcement learning and model-predictive controllers while reusing MPCViz for visualization.