Writing in the Journal of Neural Engineering, the researchers said the work could improve understanding of how babies learn to walk, and could aid the treatment of spinal injuries.
They created an artificial version of the neural signalling system that generates the rhythmic muscle signals controlling walking.
A UK expert said the work was exciting because the robot mimics control and not just movement.
The team, from the University of Arizona, were able to replicate the central pattern generator (CPG) – a nerve cell (neuronal) network in the lumbar region of the spinal cord that generates rhythmic muscle signals.
The CPG produces and then controls these signals by gathering information from the different parts of the body involved in walking, allowing it to respond to the environment.
This is what allows people to walk without thinking about it.
The simplest form of a CPG is called a half-centre. It consists of just two neurons that fire signals alternately, producing a rhythm, together with sensors that feed information, such as when a leg meets a surface, back to the half-centre.
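A half-centre like this can be captured in a few lines of simulation. The sketch below uses a standard Matsuoka-style pair of mutually inhibiting neurons with fatigue; the model choice, the parameter values and the flexor/extensor labels are illustrative assumptions, not details of the Arizona team's robot.

```python
# Minimal half-centre oscillator sketch: two mutually inhibiting neurons
# with fatigue (Matsuoka-style). Parameter values are illustrative only.

def half_centre(steps=3000, dt=0.01,
                tau=0.25,    # membrane time constant
                tau_a=0.5,   # fatigue (adaptation) time constant
                beta=2.5,    # strength of self-fatigue
                w=2.0,       # strength of mutual inhibition
                drive=1.0):  # steady descending drive
    u = [0.1, 0.0]   # membrane states (slightly asymmetric start)
    v = [0.0, 0.0]   # fatigue states
    outputs = []
    for _ in range(steps):
        y = [max(0.0, x) for x in u]   # firing rates (rectified)
        for i in range(2):
            j = 1 - i                  # the opposing neuron
            du = (-u[i] - beta * v[i] - w * y[j] + drive) / tau
            dv = (-v[i] + y[i]) / tau_a
            u[i] += du * dt
            v[i] += dv * dt
        outputs.append(tuple(y))
    return outputs

if __name__ == "__main__":
    trace = half_centre()
    for t in range(0, len(trace), 200):
        # "flexor" and "extensor" are labels chosen for illustration
        flexor, extensor = trace[t]
        print(f"t={t:4d}  flexor={flexor:5.2f}  extensor={extensor:5.2f}")
```

Run on its own, the pair settles into an alternating rhythm: as one unit's activity rises it suppresses the other, until fatigue lets its partner take over, which is the basic mechanism behind the half-centre described above.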
The University of Arizona team suggest babies start off with this simple half-centre set-up and then, over time, develop a more complex walking pattern.
They say this could explain why babies put onto a treadmill have been seen to take steps – even before they have learnt to walk.