In Katz's Introduction to Modern Cryptography, several hard problems are defined, and each is formalized via an experiment in which one algorithm generates a problem instance and another algorithm tries to solve it. For example, consider the discrete-logarithm problem in Section 9.3.2:
Let GG be a group-generation algorithm that, on input 1^n, outputs (G, q, g).
Let A be an algorithm for solving the problem.
The discrete-logarithm experiment DLog_{A,GG} (n):
- Run GG(1^n) to obtain (G, q, g), where G is a cyclic group of order q (with ||q|| = n), and g is a generator of G.
- Choose a uniform h in G.
- A is given G, q, g, h, and outputs x in Z_q.
- The output of the experiment is defined to be 1 if g^x = h, and 0 otherwise.
DEFINITION 9.63 We say the discrete-logarithm problem is hard relative to GG if for all probabilistic polynomial-time algorithms A there exists a negligible function negl such that Pr[DLog_{A,GG}(n) = 1] ≤ negl(n).
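To make the experiment concrete, here is a minimal Python sketch of DLog_{A,GG}(n). Everything in it is illustrative and not from the book: toy_group_gen, dlog_experiment, and brute_force_A are hypothetical names, and the hard-coded group (the order-11 subgroup of Z_23^*) is a toy instance with no security.

```python
import secrets

def toy_group_gen(n):
    """Toy stand-in for GG(1^n). Returns (p, q, g) describing the
    order-q subgroup of Z_p^*. A real GG would sample fresh parameters
    with ||q|| = n; these are hard-coded for illustration."""
    p, q = 23, 11   # p = 2q + 1, both prime; insecure toy sizes
    g = 4           # 4 = 2^2 generates the subgroup of order 11
    return p, q, g

def dlog_experiment(A, n):
    """DLog_{A,GG}(n): sample a uniform h in <g>, run A on the
    instance, and output 1 iff A's answer x satisfies g^x = h."""
    p, q, g = toy_group_gen(n)
    h = pow(g, secrets.randbelow(q), p)  # uniform h in the subgroup <g>
    x = A(p, q, g, h)                    # A outputs some x in Z_q
    return 1 if pow(g, x % q, p) == h else 0

def brute_force_A(p, q, g, h):
    """An adversary that always wins -- by trying all q exponents,
    so its running time is exponential in n = ||q||."""
    for x in range(q):
        if pow(g, x, p) == h:
            return x
    return 0  # unreachable when h is in <g>

print(dlog_experiment(brute_force_A, n=4))  # always prints 1
```

Note that Definition 9.63 does not say no adversary can win the experiment: brute_force_A above wins with probability 1. It says every *probabilistic polynomial-time* adversary wins with at most negligible probability, and brute-force search takes time O(q) = O(2^n), so it is not a counterexample.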
Is this kind of definition and analysis of a problem's hardness, based on the concept of an "experiment", standard in the complexity analysis of problems in general (not just in cryptography)?
Do books on the complexity analysis of problems use this kind of definition and analysis of hardness?
Thanks.