The Impostor's Handbook - Rob Conery
What is computation? This has always been the most fundamental question of our field. In the 1930s, as the field was taking shape, the answer was that computation was the work of people who operated calculating machines. By the late 1940s, the answer was that computation was the steps carried out by automated computers to produce definite outputs. That definition served well: it remained the standard for nearly fifty years. But it is now being challenged. People in many fields have accepted computational thinking as a way of approaching science and engineering. The Internet is full of servers that compute continuously, with no intended endpoint. Researchers in biology and physics have claimed to discover natural computational processes that have nothing to do with computers. How must our definition evolve to answer the challenges of brains computing, algorithms that never terminate by design, computation as a natural occurrence, and computation without computers?
All these definitions frame computation as the actions of an agent carrying out computational steps. New definitions will focus on new agents: how well they match real systems, their explanatory and predictive power, and their ability to support new designs. There have already been real surprises about what can serve as a computational agent, and more lie ahead.
Complexity Theory centers on a single question: how difficult is a problem to solve?
The key to Complexity Theory: you understand complexity in terms of how an algorithm's running time grows as you scale up the inputs to the algorithm you're using to solve the problem.
T = 2x + 4y + 7
When the complexity of an algorithm scales according to a polynomial expression like this, we say that it runs in "P time" (polynomial time). These are typically the simplest problems to work with.
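To make this concrete, here is a minimal sketch (the `count_pairs` function is hypothetical, not from the book) of a polynomial-time algorithm: counting every pair in a list takes roughly n² steps, so doubling the input size roughly quadruples the work.

```python
# A quadratic (polynomial-time) algorithm: count every pair in a list.
# The nested loops do about n^2 / 2 units of work for an input of size n.
def count_pairs(items):
    steps = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            steps += 1  # one unit of work per pair examined
    return steps

# Doubling the input size roughly quadruples the work:
print(count_pairs(list(range(10))))  # 45 steps
print(count_pairs(list(range(20))))  # 190 steps
```

Notice the growth stays manageable: even at n = 1,000 this is only about half a million steps, which is why polynomial-time problems are considered tractable.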
T = 2^n
Hard problems are exponentially complex: with a formula like T = 2^n, each additional input doubles the running time, so even modestly sized inputs take a very, very long time to solve.
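A sketch of exponential scaling (the `all_subsets` function is an illustrative example, not from the book): enumerating every subset of a set of n items requires 2^n steps, because each new item doubles the number of subsets to consider.

```python
# An exponential-time task: enumerate every subset of a collection.
# Each additional item doubles the number of subsets, so the work is 2^n.
def all_subsets(items):
    subsets = [[]]  # start with the empty subset
    for item in items:
        # Every existing subset spawns a copy that also contains `item`.
        subsets += [s + [item] for s in subsets]
    return subsets

print(len(all_subsets(list(range(4)))))   # 16 subsets  (2^4)
print(len(all_subsets(list(range(10)))))  # 1024 subsets (2^10)
```

At n = 60 this would already require more steps than there are seconds in the age of the universe, which is what "a very, very long time" means in practice.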