Conference Paper

Smoothed Motion Complexity.

Conference: Algorithms - ESA 2003, 11th Annual European Symposium, Budapest, Hungary, September 16-19, 2003, Proceedings
Source: DBLP
  • ABSTRACT: Binary search trees are one of the most fundamental data structures. While the height of such a tree may be linear in the worst case, the average height with respect to the uniform distribution is only logarithmic. The exact value is one of the best-studied problems in average-case complexity. We investigate what happens in between by analysing the smoothed height of binary search trees: randomly perturb a given (adversarial) sequence and then take the expected height of the binary search tree generated by the resulting sequence. As perturbation models, we consider partial permutations, partial alterations, and partial deletions. On the one hand, we prove tight lower and upper bounds of roughly √(n/p) for the expected height of binary search trees under partial permutations and partial alterations, where n is the number of elements and p is the smoothing parameter. This means that worst-case instances are rare and disappear under slight perturbations. On the other hand, we examine how much a perturbation can increase the height of a binary search tree, i.e., how much worse well-balanced instances can become.
    Theoretical Computer Science, 2007.
  • ABSTRACT: We study the complexity of the visibility map of terrains whose triangles are fat, not too steep, and have roughly the same size. It is known that the complexity of the visibility map of such a terrain with n triangles is Θ(n²) in the worst case. We prove that if the elevations of the vertices of the terrain are subject to uniform noise which is proportional to the edge lengths, then the worst-case expected (smoothed) complexity is only Θ(n). We also prove non-trivial bounds for the smoothed complexity of instances where some triangles do not satisfy the above properties. Our results provide an explanation of why visibility maps of superlinear complexity are unlikely to be encountered in practice.
    Journal of Computational Geometry, 2010.
  • ABSTRACT: Many algorithms and heuristics work well on real data, despite having poor complexity under the standard worst-case measure. Smoothed analysis [36] is a step towards a theory that explains the behavior of algorithms in practice. It is based on the assumption that inputs to algorithms are subject to random perturbation and modification in their formation. A concrete example of such a smoothed analysis is a proof that the simplex algorithm for linear programming usually runs in polynomial time when its input is subject to modeling or measurement noise.
    Communications of the ACM, 2009; 52:76-84.
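
To make the partial-permutation model from the first abstract concrete, here is a minimal Python sketch. It is only an illustration under our own assumptions; the function names, the choice of n and p, and the use of a sorted sequence as the adversarial input are ours, not taken from the papers above.

    import random

    def partial_permutation(seq, p, rng=random):
        # Partial-permutation model as informally described in the abstract:
        # mark each position independently with probability p, then randomly
        # permute the elements sitting at the marked positions.
        seq = list(seq)
        marked = [i for i in range(len(seq)) if rng.random() < p]
        values = [seq[i] for i in marked]
        rng.shuffle(values)
        for i, v in zip(marked, values):
            seq[i] = v
        return seq

    def bst_height(seq):
        # Height (number of nodes on the longest root-to-leaf path) of the
        # binary search tree built by inserting the elements of seq in order.
        # Iterative, so a long adversarial sequence does not overflow the
        # recursion stack; elements are assumed to be distinct.
        left, right, root, height = {}, {}, None, 0
        for x in seq:
            if root is None:
                root, height = x, 1
                continue
            node, depth = root, 1
            while True:
                depth += 1
                children = left if x < node else right
                if node in children:
                    node = children[node]
                else:
                    children[node] = x
                    break
            height = max(height, depth)
        return height

    if __name__ == "__main__":
        n, p = 2000, 0.1
        adversarial = list(range(n))              # sorted input: height n
        print(bst_height(adversarial))            # 2000
        perturbed = partial_permutation(adversarial, p)
        print(bst_height(perturbed))              # on the order of √(n/p) ≈ 141

The sorted sequence degenerates the tree into a path of height n, while a mild perturbation (p = 0.1) already brings the expected height down to the order of √(n/p), matching the behavior the bounds above describe.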
