A simple model for the loss created by the tip clearance flow in axial compressors is presented, based on an experimental program performed in conjunction with the Dawes three-dimensional Navier-Stokes calculation method. The principal mechanism of loss (entropy creation) caused by tip leakage flow has been established to be the mixing of flows of similar speed but different direction. Calculations show that motion of the endwall relative to the tip has a small effect on the clearance flow. The simple model correctly predicts the magnitude of tip clearance loss and the trend with changes of tip clearance for the cascade tested. For a given geometry the loss is almost exactly proportional to the ratio of tip clearance to blade span; the loss directly associated with the clearance is smaller than often assumed. The simple model for tip clearance loss has been expressed in terms of conventional nondimensional design variables (for example: solidity, aspect ratio, flow coefficient, loading coefficient), and from these the contribution of tip leakage flow to the overall loss of efficiency is conveniently represented. The trends are illustrated for a number of possible compressor design choices. Blade row loss increases more slowly than blade loading (for example, diffusion factor). As a result, the decrement in stage efficiency associated with clearance flow decreases as the stage loading is raised in the practical range of flow and loading coefficients.
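The stated proportionality between loss and the clearance-to-span ratio can be sketched in a few lines. This is an illustrative toy only: the function name `tip_clearance_loss` and the proportionality constant `k` are assumptions for demonstration, not quantities from the paper's model.

```python
def tip_clearance_loss(clearance: float, span: float, k: float = 1.0) -> float:
    """Hypothetical linear scaling: loss ~ k * (clearance / span).

    The abstract reports that, for a given geometry, tip clearance loss
    is almost exactly proportional to clearance/span; k stands in for
    the geometry-dependent constant, which the paper would supply.
    """
    return k * (clearance / span)

# Under a purely linear model, doubling the clearance doubles the loss.
loss_small = tip_clearance_loss(0.001, 0.050)  # 1 mm clearance, 50 mm span
loss_large = tip_clearance_loss(0.002, 0.050)  # 2 mm clearance, same span
print(loss_large / loss_small)  # 2.0
```

Such a linear sensitivity is what makes the clearance-to-span ratio a convenient nondimensional design variable alongside solidity, aspect ratio, and the flow and loading coefficients.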