Complexity theory

Acronym
N/A

Alternate name(s)
Computational Complexity Theory

Main dependent construct(s)/factor(s)
Time (Time Complexity), Space (Memory), Parallel Processors

Main independent construct(s)/factor(s)
Size of the Input

Concise description of theory
Complexity theory is the part of the theory of computation dealing with the resources required during computation to solve a given problem. The most common resources are time (how many steps it takes to solve a problem) and space (how much memory it takes). Other resources can also be considered, such as how many parallel processors are needed to solve a problem in parallel. Complexity theory differs from computability theory, which deals with whether a problem can be solved at all, regardless of the resources required. Once computability theory had established which problems can be solved and which cannot, it was natural to ask about the relative computational difficulty of the computable functions. This is the subject matter of computational complexity.

The time complexity of a problem is the number of steps that it takes to solve an instance of the problem, as a function of the size of the input (usually measured in bits), using the most efficient algorithm. The exact number of steps depends on the machine model or programming language used, so time complexity is usually stated only up to a constant factor, in terms of its growth rate.
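As a minimal sketch of this idea (the function name and step-counting convention here are illustrative assumptions, not part of the theory), one can count the comparisons a simple linear search performs as the input grows; formal time complexity keeps only the growth rate of such counts:

```python
# Illustrative sketch: count the "steps" (here, comparisons) of linear search
# as the input size grows. Formal time complexity abstracts away machine
# details and keeps only the growth rate, e.g. O(n) for this scan.

def linear_search_steps(items, target):
    """Return (found, comparisons) for a straightforward linear scan."""
    steps = 0
    for x in items:
        steps += 1
        if x == target:
            return True, steps
    return False, steps

for n in (10, 100, 1000):
    # Worst case: the target is absent, so every element is examined.
    found, steps = linear_search_steps(list(range(n)), -1)
    print(n, steps)  # the worst-case step count grows linearly with n
```

Doubling the input size doubles the worst-case step count, which is what the statement "linear search is O(n)" captures independently of any particular machine.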

Much of complexity theory deals with decision problems. A decision problem is a problem whose answer is always YES or NO. Decision problems are often considered because an arbitrary problem can generally be reduced to a decision problem without changing its essential difficulty.
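To make the reduction idea concrete, here is a hedged toy sketch (the function names and the brute-force approach are assumptions for illustration, not from the source): the optimization problem "find the minimum vertex cover size" can be answered purely by asking the YES/NO decision question "is there a cover of size at most k?" for increasing k:

```python
from itertools import combinations

def has_vertex_cover(edges, nodes, k):
    """Decision problem: YES iff some set of <= k nodes touches every edge.
    Brute force for illustration only; exponential time in general."""
    for size in range(min(k, len(nodes)) + 1):
        for subset in combinations(nodes, size):
            chosen = set(subset)
            if all(u in chosen or v in chosen for u, v in edges):
                return True
    return False

def min_vertex_cover_size(edges, nodes):
    """The optimization problem, answered via a sequence of decision queries."""
    k = 0
    while not has_vertex_cover(edges, nodes, k):
        k += 1
    return k

edges = [(1, 2), (1, 3), (2, 3), (3, 4)]
print(min_vertex_cover_size(edges, [1, 2, 3, 4]))  # -> 2 (e.g. cover {1, 3})
```

The optimization version is thus no harder than a polynomial number of calls to the decision version, which is why studying the YES/NO form loses little generality.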

The complexity class P is the set of decision problems that can be solved by a deterministic machine in polynomial time. This class corresponds to an intuitive idea of the problems that can be solved efficiently even in the worst case.
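A small sketch of a problem in P (an assumed example chosen for illustration): the decision question "is this undirected graph connected?" can be answered by breadth-first search in time polynomial (in fact linear) in the number of nodes and edges:

```python
from collections import deque

def is_connected(adjacency):
    """Decide connectivity of an undirected graph in polynomial time.
    adjacency: dict mapping each node to a list of its neighbors."""
    nodes = list(adjacency)
    if not nodes:
        return True
    seen = {nodes[0]}
    queue = deque([nodes[0]])
    while queue:  # standard BFS: each node and edge is handled once
        u = queue.popleft()
        for v in adjacency[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == len(nodes)

print(is_connected({1: [2], 2: [1, 3], 3: [2]}))  # -> True
print(is_connected({1: [2], 2: [1], 3: []}))      # -> False (node 3 isolated)
```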

The complexity class NP is the set of decision problems that can be solved by a non-deterministic machine in polynomial time. This class contains many problems that people would like to be able to solve efficiently, including the Boolean satisfiability problem, the Hamiltonian path problem, and the vertex cover problem. All the problems in this class have the property that their solutions can be checked efficiently.
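The "solutions can be checked efficiently" property can be sketched with a polynomial-time verifier (an assumed illustration; the function name is hypothetical): finding a vertex cover may be hard, but checking a proposed one is a simple scan over the edges:

```python
def verify_vertex_cover(edges, candidate, k):
    """Polynomial-time check of a YES-certificate for Vertex Cover:
    is `candidate` a set of at most k nodes touching every edge?"""
    cover = set(candidate)
    if len(cover) > k:
        return False
    return all(u in cover or v in cover for u, v in edges)

edges = [(1, 2), (2, 3), (3, 4)]
print(verify_vertex_cover(edges, [2, 3], 2))  # -> True
print(verify_vertex_cover(edges, [1], 2))     # -> False (edge (2, 3) uncovered)
```

This verifier runs in time linear in the number of edges, which is exactly the kind of efficient certificate-checking that characterizes membership in NP.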

An important result in complexity theory is the fact that no matter how hard a problem gets (i.e., how much time and space it requires), there will always be even harder problems. For time complexity, this is established by the time hierarchy theorem. A similar space hierarchy theorem can also be derived.
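For reference, a standard modern statement of the deterministic time hierarchy theorem (for any time-constructible function f) and one well-known corollary can be written as:

```latex
% Deterministic time hierarchy theorem, for time-constructible f(n):
\mathrm{DTIME}\!\left(f(n)\right) \subsetneq \mathrm{DTIME}\!\left(f(n)\,\log^{2} f(n)\right)

% A well-known corollary separating two concrete classes:
\mathsf{P} \subsetneq \mathsf{EXPTIME}
```

Slightly more time provably buys strictly more solvable problems, which is the precise sense in which "there are always harder problems."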

Sources: http://en.wikipedia.org/wiki/Computational_complexity_theory, http://www.absoluteastronomy.com/encyclopedia/c/co/computational_complexity_theory.htm

Diagram/schematic of theory
N/A

Originating author(s)
Juris Hartmanis, Richard Stearns

Seminal articles
Cook, Stephen A. May 1971. The complexity of theorem proving procedures. Proceedings of the Third Annual ACM Symposium on Theory of Computing: 151-158.

Goldreich, Oded, Goldwasser, Shafi, & Micali, Silvio. 1986. How to construct random functions. Journal of the ACM, 33(4): 792-807.

Goldwasser, Shafi, Micali, Silvio, & Rackoff, Charles. February 1989. The knowledge complexity of interactive proof systems. SIAM Journal on Computing, 18(1): 186-208.

Hartmanis, J., & Stearns, R. E. 1965. On the computational complexity of algorithms. Transactions of the American Mathematical Society, 117: 285-306.

Originating area
Computer Science, Mathematics

Level of analysis
System

IS articles that use the theory
Aslan, Heba K. 2004. A scalable and distributed multicast security protocol using a subgroup-key hierarchy. Computers and Security, 23(4): 320-329.

Berardi, Daniela, Calvanese, Diego, De Giacomo, Giuseppe, Lenzerini, Maurizio, & Mecella, Massimo. 2005. Automatic service composition based on behavioral descriptions. International Journal of Cooperative Information Systems, 14(4): 333-376.

Deshpande, Mukund, & Karypis, George. January 2004. Item-based top-N recommendation algorithms. ACM Transactions on Information Systems, 22(1): 143-177.

Dominich, Sandor. 2003. Connectionist interaction information retrieval. Information Processing and Management, 39(2): 167-193.

Francalanci, Chiara, & Piuri, Vincenzo. 1999. Designing information technology architectures: A cost-oriented methodology. Journal of Information Technology (Routledge, Ltd.), 14(2): 181-192.

Glass, Robert L. 2002. Sorting out software complexity. Communications of the ACM, 45(11): 19.

Li, Ninghui, Mitchell, John C., & Winsborough, William H. 2005. Beyond proof-of-compliance: Security analysis in trust management. Journal of the ACM, 52(3): 474-514.

Pekec, Aleksandar, & Rothkopf, Michael H. 2003. Combinatorial auction design. Management Science, 49(11): 1485-1503.

Sarkar, Sumit, & Ramaswamy, Mysore. 2000. Knowledge base decomposition to facilitate verification. Information Systems Research, 11(3): 260.

Schneberger, Scott, & McLean, Ephraim. September 2003. The complexity cross: Implications for practice. Communications of the ACM, 46(9).

Myung, Young-Soo, Kim, Hyun-joon, & Tcha, Dong-Wan. 1999. Design of communication networks with survivability constraints. Management Science, 45(2): 238.

Links from this theory to other theories
Theory of Computation, Theoretical Computer Science

Original Contributor(s)
Hamid Akbari