Probabilistic analysis of algorithms
In analysis of algorithms, probabilistic analysis of algorithms is an approach to estimate the computational complexity of an algorithm or a computational problem. It starts from an assumption about a probability distribution over the set of all possible inputs. This assumption is then used to design an efficient algorithm or to derive the complexity of a known algorithm.
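
As an illustration (a minimal sketch, not part of the original article), the following Python program assumes that the inputs to insertion sort are uniformly random permutations and estimates the expected number of comparisons by averaging over sampled inputs; under that assumption the average is known to be about n^2/4 comparisons.

    import random

    def insertion_sort_comparisons(a):
        """Sort a copy of `a` by insertion sort; return the comparison count."""
        a = list(a)
        comparisons = 0
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            while j >= 0:
                comparisons += 1        # key is compared against a[j]
                if a[j] > key:
                    a[j + 1] = a[j]     # shift the larger element right
                    j -= 1
                else:
                    break
            a[j + 1] = key
        return comparisons

    # Assumed input distribution: uniformly random permutations of 0..n-1.
    n, trials = 200, 1000
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)
        total += insertion_sort_comparisons(perm)

    print("empirical average       :", total / trials)
    print("theoretical, about n^2/4:", n * n / 4)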

This approach is distinct from that of probabilistic (randomized) algorithms, but the two may be combined.

For non-probabilistic (more specifically, deterministic) algorithms, the most common types of complexity estimates are
  • the average-case complexity (expected time complexity), in which, given an input distribution, the expected running time of the algorithm is evaluated (see the sketch after this list);
  • the almost-always complexity estimates, in which, given an input distribution, one shows that the algorithm admits a given complexity estimate that almost surely holds.
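
For example (an illustrative sketch, not from the original article), consider the average-case cost of sequential search in a list of n elements: if the target is equally likely to occupy any of the n positions, the expected number of comparisons is (1 + 2 + ... + n)/n = (n + 1)/2. The Python sketch below checks this analytic average empirically; the function and variable names are hypothetical.

    import random

    def sequential_search_comparisons(a, target):
        """Return how many comparisons a linear scan makes to find `target`."""
        for count, x in enumerate(a, start=1):
            if x == target:
                return count
        return len(a)   # target absent: every element was compared

    # Assumed input distribution: the target is uniformly likely to be
    # at any of the n positions.
    n, trials = 100, 10000
    a = list(range(n))
    total = sum(sequential_search_comparisons(a, random.randrange(n))
                for _ in range(trials))

    print("empirical average:", total / trials)
    print("analytic (n+1)/2 :", (n + 1) / 2)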

Probabilistic Algorithms

In the probabilistic analysis of probabilistic (randomized) algorithms, the distributions over, or averaging over, all possible choices made in randomized steps are also taken into account, in addition to the input distributions.
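
A sketch of this combination (illustrative, not from the original article): randomized quicksort chooses its pivots at random, so even for a single fixed input, say an already sorted list, its comparison count is a random variable, and averaging over the algorithm's own coin flips alone gives an expectation whose leading term is the well-known 2 n ln n.

    import math
    import random

    def randomized_quicksort_comparisons(a):
        """Sort a copy of `a` with random-pivot quicksort; return the number
        of element-vs-pivot comparisons counted during partitioning."""
        a = list(a)
        comparisons = 0

        def sort(lo, hi):
            nonlocal comparisons
            if hi - lo < 2:
                return
            pivot = a[random.randrange(lo, hi)]   # the algorithm's random choice
            less, equal, greater = [], [], []
            for x in a[lo:hi]:
                comparisons += 1                  # count one comparison per element
                if x < pivot:
                    less.append(x)
                elif x > pivot:
                    greater.append(x)
                else:
                    equal.append(x)
            a[lo:hi] = less + equal + greater
            sort(lo, lo + len(less))
            sort(hi - len(greater), hi)

        sort(0, len(a))
        return comparisons

    # Fixed input (already sorted); the averaging below is over the
    # algorithm's random pivot choices only, not over inputs.
    n, trials = 500, 200
    data = list(range(n))
    avg = sum(randomized_quicksort_comparisons(data)
              for _ in range(trials)) / trials
    print("empirical average :", avg)
    # Leading term only; lower-order terms make the exact expectation
    # somewhat smaller at moderate n.
    print("leading term 2 n ln n:", 2 * n * math.log(n))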

See also

  • Amortized analysis
  • Average-case complexity
  • Best, worst and average case
  • Random self-reducibility