Moving a sensor through its environment induces characteristic time variations in the sensor's readings, often referred to as flow cues. We analyze the acoustic flow field generated by a sonar sensor, capable of imaging the full frontal hemisphere, mounted on a mobile platform. We show how cues derived from this acoustic flow field can be used directly in a layered control strategy that enables a robotic platform to perform a set of motion primitives such as obstacle avoidance, corridor following, and negotiating corners and T-junctions. The programmable spatial sampling pattern of the sonar allows efficient support of the varying information requirements of these different motion primitives. The proposed control strategy is first validated in a simulated environment and subsequently transferred to a real mobile robot. We present simulated and experimental results on the controller's performance while executing the different motion primitives. The results further show that the proposed control strategy can easily integrate minimal steering commands given by a user (e.g., an electric wheelchair application) or by a high-level navigation module (e.g., autonomous SLAM applications).