As the title says. I'm reading the Wikipedia page titled "Effective Method", which is both fortunate and unfortunate. (1. I love that it's open and available to everyone on Earth; 2. I distrust unverified documents.)
I don't think the existing "Definition" section is formal, given that the page for the closely related (and, some say, equivalent) idea of "algorithm" reads "Algorithm does not have a generally accepted formal definition." I agree with that after reading the history on the same page. I have searched at least three of the books it references, but I still can't find one that formally defines what an "effective method" is. (What makes the situation worse is that, judging by a Google search, the term also seems to be used in philosophy.) Is there an expert here who could answer the following:
- Is this term important? (Then again, how could it not be, given its relation to the term "algorithm"?)
- Has it ever been formally defined (in the history of computing)?
My false-positive "I almost found it!" moment came when I read this line from Church's Thesis and Principles for Mechanisms by Robin Gandy (a student of Turing):
The word "effective" in the thesis serves to emphasize that the process of calculation is deterministic - not dependent on guesswork - and that it must terminate after a finite time.
But unfortunately, this line explains something "related", not the term itself:
Church's Thesis. What is effectively calculable is computable.
If I pick this "definition", albeit from an authoritative source, then I implicitly assume that the underlying "model/runner/machine/whatever" is a Turing Machine. But it seems that it could be just an instance of some "entity" with "an effective method".
2 Answers
The term "effectively calculable function" is historically important. It is a pre-mathematical idea (and therefore informal by its very nature) that some things can be calculated by following instructions that require no ingenuity or magic of any kind. There have been several attempts to make this pre-mathematical idea precise, as the Wikipedia page you linked to already explains. These attempts were very successful, as they essentially lead to modern computer science and computers. You can stop looking for a formal definition.
There are other similar situations. For example, the pre-mathematical ideas of "number" and "space" have their formal counterparts in mathematics. In fact, there are many formalizations of "number" and "space". Likewise, there are many formal mathematical models of what "computation" is. Despite what you will read on Wikipedia, they're not all equivalent. (They happen to compute the same number-theoretic functions, which is a rather weak notion of equivalence.)
- Can you say more about "not equivalent"? Do you mean their different runtimes, or more? – Jim Hefferon, May 16 at 13:12
- No, this is not about runtime; it's about what these models can compute. For example, they may or may not compute the so-called "modulus of continuity" functional; see math.andrej.com/2006/03/27/… – Andrej Bauer, May 16 at 13:36
It is not precisely defined. It is a method or procedure that can be performed in reasonable time, and it is up to you to define "reasonable time".
For me it is "whatever a computer that I can buy for 5,000ドル can do in its lifetime". You can change it to 2,000ドル and 24 hours, or one minute, if you prefer that.
And usually there is some parameter, say the "problem size" n, that affects the time. For example, if your method takes n^3 nanoseconds of work (split evenly across your cores), then a problem of size n = 1,000,000 will take about two years on 16 cores.
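To make the arithmetic behind that estimate easy to check, here is a minimal sketch in Python (the helper name `runtime_years` and the assumption of perfect parallel speedup are mine, for illustration only):

```python
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def runtime_years(n, ns_per_step=1.0, cores=1):
    """Wall-clock years for a method doing n**3 steps at ns_per_step
    nanoseconds each, assuming the work splits perfectly across cores."""
    total_seconds = (ns_per_step * n**3 / cores) * 1e-9
    return total_seconds / SECONDS_PER_YEAR

print(round(runtime_years(1_000_000, cores=16), 2))  # ~1.98, i.e. "about two years"
```

With n = 1,000,000 and 16 cores this gives roughly 1.98 years, matching the "about two years" figure above.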