Rick C <gnuarm.del...@gmail.com> writes:
Anyone familiar with this?
Meh, there is not really an accepted formal definition, and to some
extent it is a philosophy question. The word algorithm comes from
Arabic, but the modern notion of Turing machines is from the 1930s.
There are a bunch of viewpoints on the topic here:
https://en.wikipedia.org/wiki/Algorithm_characterizations
A further take is here:
https://plato.stanford.edu/entries/computer-science/#Algo
Anyone familiar with this?
On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
wrote:
Rick C <gnuarm.del...@gmail.com> writes:
Anyone familiar with this?
Meh, there is not really an accepted formal definition, and to
some extent it is a philosophy question. The word algorithm comes
from Arabic, but the modern notion of Turing machines is from the
1930s.
There are a bunch of viewpoints on the topic here:
https://en.wikipedia.org/wiki/Algorithm_characterizations
A further take is here:
https://plato.stanford.edu/entries/computer-science/#Algo
If there's no definition, I guess I need to stop using the term.
I recall in one of the early computer science classes I took, a
professor defined an algorithm in a mathematical definition. She
gave a list of properties an algorithm had to have to qualify as an algorithm. She also listed some features that were not required,
such as being a computer program.
I recall these features:
1) Output - without output a procedure is pointless.
2) Input - ??? I want to say input is optional. So a set of steps
to calculate some number of digits of pi would qualify as an
algorithm, in spite of not having inputs.
3) Steps - she used qualifiers that indicated the steps had to be
clear and unambiguous.
4) Finite - the steps must come to an end, i.e. at some point the
algorithm has to produce the result, it can't be infinite.
I don't recall other qualifications, but I think there is at least
one I'm missing. It was not a long list, and, like I've said, I
don't think Input was on the list.
The web searches seem to produce some pretty vague, garbage
definitions, some even saying it is a computer program, which I'm
certain it does not have to be.
Anyone familiar with this?
On 02/11/2022 04:04, Rick C wrote:
I recall in one of the early computer science classes I took, a
professor defined an algorithm in a mathematical definition. She
gave a list of properties an algorithm had to have to qualify as an algorithm. She also listed some features that were not required,
such as being a computer program.
Just remember that any such definition will be a "lie-to-children" - an oversimplification that covers what you need at the time, but not the
full picture.
(That is not a bad thing in any way - it is essential to
all learning. And it's the progress from oversimplifications upwards
that keeps things fun. If you are not familiar with the term "lie-to-children", I recommend "The Science of Discworld", which popularised it.)
I recall these features:
1) Output - without output a procedure is pointless.
2) Input - ??? I want to say input is optional. So a set of steps
to calculate some number of digits of pi would qualify as an
algorithm, in spite of not having inputs.
You could argue that there is always some kind of input. For example,
you could say that the digit index or the number of digits is the input
to the "calculate pi" algorithm.
(As an aside, I find it fascinating that there is an algorithm to
calculate the nth digit of the hexadecimal expansion of pi that does not need to calculate all the preceding digits.)
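For concreteness, a minimal Python sketch of that idea, based on the
Bailey-Borwein-Plouffe (BBP) digit-extraction formula. The function names
and the 25-term tail are arbitrary illustrative choices, and plain floating
point limits it to modest values of n, so treat it as a sketch rather than
a reference implementation:

def bbp_term(j, n):
    # Fractional part of sum_k 16^(n-k)/(8k+j), using modular exponentiation
    # for the head of the series, so no earlier digits are ever computed.
    s = 0.0
    for k in range(n + 1):
        s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
    for k in range(n + 1, n + 25):        # tiny tail, enough for one digit
        s += 16.0 ** (n - k) / (8 * k + j)
    return s % 1.0

def pi_hex_digit(n):
    # n-th hexadecimal digit of pi after the point, counting from n = 0.
    x = (4 * bbp_term(1, n) - 2 * bbp_term(4, n)
         - bbp_term(5, n) - bbp_term(6, n)) % 1.0
    return "0123456789abcdef"[int(16 * x)]

print("".join(pi_hex_digit(n) for n in range(8)))   # 243f6a88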
3) Steps - she used qualifiers that indicated the steps had to be
clear and unambiguous.
4) Finite - the steps must come to an end, i.e. at some point the algorithm has to produce the result, it can't be infinite.
Is that really true?
If you can accept a "calculate pi" algorithm that does not have an
input, then it is not finite.
I remember a programming task from my days at university (doing
mathematics and computation), writing a function that calculated the
digits of pi. The language in question was a functional programming
language similar to Haskell. The end result was a function that
returned an infinite list - you could then print out as many digits from that list as you wanted.
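A rough Python analogue of that exercise (the original was in a
Haskell-like functional language; this sketch uses a generator in place of
a lazy list and follows Jeremy Gibbons' unbounded spigot, so the details
here are assumptions rather than the original code):

def pi_digits():
    # Unbounded spigot (after Jeremy Gibbons): yields decimal digits of pi
    # one at a time, for as long as the caller keeps asking.
    q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
    while True:
        if 4 * q + r - t < n * t:
            yield n
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
        else:
            q, r, t, n, k, l = (q * k, (2 * q + r) * l, t * l,
                                (q * (7 * k + 2) + r * l) // (t * l), k + 1, l + 2)

digits = pi_digits()
print([next(digits) for _ in range(10)])   # [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]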
So what was the output? What kind of output do you allow from your algorithms? Does the algorithm have to stop and produce an output, or
can it produce a stream of output underway? Can the output be an
algorithm itself? Is it acceptable (according to your definition) for
the output of a "calculate pi" algorithm to be a tuple of a digit of pi
and an algorithm to calculate the next digit-algorithm tuple?
I don't recall other qualifications, but I think there is at least
one I'm missing. It was not a long list, and, like I've said, I
don't think Input was on the list.
Was it "deterministic" ? Not all algorithms are deterministic and repeatable, but it is often a very useful characteristic, and it might
have been relevant for the course you were doing at the time.
The web searches seem to produce some pretty vague, garbage
definitions, some even saying it is a computer program, which I'm
certain it does not have to be.
Anyone familiar with this?
I think it is unlikely that you'll get a perfect match for your
professor's definition, except by luck - because I don't think there
really is a single definition. I don't think there is even a single consistent definition for your features "output", "input", "steps", or "finite" (in this context).
That's why Turing went to such effort to define his Turing Machine (and
some other mathematicians of the time came up with alternative
"universal computers"). If you want a rigorous definition, you have to
go back to 0's and 1's and something mathematically precise such as a TM
or universal register machine (which is easier to program than a TM).
Of course, you are then restricting your algorithms - it no longer
includes a recipe for brownies, for example, but it does give you
something you can define and reason about.
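As a toy illustration of what "mathematically precise" can mean here, a
minimal Turing-machine simulator sketched in Python. It is not any
particular textbook's formalisation - a machine is just a transition table,
and running it is a purely mechanical procedure:

def run_turing_machine(tape, rules, state="start", blank="_", max_steps=10_000):
    # rules maps (state, symbol) -> (new_symbol, move, new_state), move in {-1, +1}.
    # Halts when no rule applies or when max_steps is exceeded.
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in rules:
            break
        new_symbol, move, state = rules[(state, symbol)]
        tape[head] = new_symbol
        head += move
    cells = range(min(tape), max(tape) + 1) if tape else range(0)
    return "".join(tape.get(i, blank) for i in cells), state

# Example: a two-rule machine that flips every bit on the tape, then halts.
flip_rules = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
}
print(run_turing_machine("10110", flip_rules))   # ('01001', 'start')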
On 02/11/2022 06:42, Rick C wrote:
On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
wrote:
Rick C <gnuarm.del...@gmail.com> writes:
Anyone familiar with this?
Meh, there is not really an accepted formal definition, and to
some extent it is a philosophy question. The word algorithm comes
from Arabic, but the modern notion of Turing machines is from the
1930s.
There are a bunch of viewpoints on the topic here:
https://en.wikipedia.org/wiki/Algorithm_characterizations
A further take is here:
https://plato.stanford.edu/entries/computer-science/#Algo
If there's no definition, I guess I need to stop using the term.
If you stopped using a term just because there is no single formal
definition, you'd have to stop saying anything at all. After all,
there's no definition for the terms "definition" or "term".
It just means that if you are writing something that is supposed to be rigorous, you have to define what /you/ mean by a given term in the
context of the discussion.
...
You could argue that there is always some kind of input. For example,
2) Input - ??? I want to say input is optional. So a set of steps
to calculate some number of digits of pi would qualify as an
algorithm, in spite of not having inputs.
you could say that the digit index or the number of digits is the input
to the "calculate pi" algorithm.
You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.
.....
Is that really true?
4) Finite - the steps must come to an end, i.e. at some point the
algorithm has to produce the result, it can't be infinite.
If you can accept a "calculate pi" algorithm that does not have an
input, then it is not finite.
Not true.
On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
I remember a programming task from my days at university (doing
mathematics and computation), writing a function that calculated the
digits of pi. The language in question was a functional programming
language similar to Haskell. The end result was a function that
returned an infinite list - you could then print out as many digits from
that list as you wanted.
How did it return an infinite list?
Was it "deterministic" ? Not all algorithms are deterministic and
repeatable, but it is often a very useful characteristic, and it might
have been relevant for the course you were doing at the time.
If it's not deterministic, it's not an algorithm. Flipping a coin is
not an algorithm.
On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
On 02/11/2022 04:04, Rick C wrote:
I recall in one of the early computer science classes I took, a
professor defined an algorithm in a mathematical definition. She
gave a list of properties an algorithm had to have to qualify as an
algorithm. She also listed some features that were not required,
such as being a computer program.
Just remember that any such definition will be a "lie-to-children" - an
oversimplification that covers what you need at the time, but not the
full picture.
Sorry, I have no idea what you are talking about.
(That is not a bad thing in any way - it is essential to
all learning. And it's the progress from oversimplifications upwards
that keeps things fun. If you are not familiar with the term
"lie-to-children", I recommend "The Science of the Discworld" that
popularised it.)
I recall these features:
1) Output - without output a procedure is pointless.
2) Input - ??? I want to say input is optional. So a set of steps
to calculate some number of digits of pi would qualify as an
algorithm, in spite of not having inputs.
You could argue that there is always some kind of input. For example,
you could say that the digit index or the number of digits is the input
to the "calculate pi" algorithm.
You can argue anything. A procedure to calculate pi without
specifying how many digits would not be an algorithm because of the
"finite" requirement. It doesn't need an input to set the number of
digits if that is built into the procedure.
(As an aside, I find it fascinating that there is an algorithm to
calculate the nth digit of the hexadecimal expansion of pi that does not
need to calculate all the preceding digits.)
Is that really true?
3) Steps - she used qualifiers that indicated the steps had to be
clear and unambiguous.
4) Finite - the steps must come to an end, i.e. at some point the
algorithm has to produce the result, it can't be infinite.
If you can accept a "calculate pi" algorithm that does not have an
input, then it is not finite.
Not true.
If it has no definite end, it is not an algorithm. That's why an
operating system is not typically an algorithm, it has no specific termination unless you have a command to shut down the computer, I suppose.
I remember a programming task from my days at university (doing
mathematics and computation), writing a function that calculated the
digits of pi. The language in question was a functional programming
language similar to Haskell. The end result was a function that
returned an infinite list - you could then print out as many digits from
that list as you wanted.
How did it return an infinite list? You mean it returned as many
digits as you specified or waited to have calculated? If you have an
infinite storage system, that's a pretty remarkable achievement. But I expect it would be powered by an infinite improbability drive.
So what was the output? What kind of output do you allow from your
algorithms? Does the algorithm have to stop and produce an output, or
can it produce a stream of output underway? Can the output be an
algorithm itself? Is it acceptable (according to your definition) for
the output of a "calculate pi" algorithm to be a tuple of a digit of pi
and an algorithm to calculate the next digit-algorithm tuple?
I don't recall other qualifications, but I think there is at least
one I'm missing. It was not a long list, and, like I've said, I
don't think Input was on the list.
Was it "deterministic" ? Not all algorithms are deterministic and
repeatable, but it is often a very useful characteristic, and it might
have been relevant for the course you were doing at the time.
If it's not deterministic, it's not an algorithm. Flipping a coin is not an algorithm.
The web searches seem to produce some pretty vague, garbage
definitions, some even saying it is a computer program, which I'm
certain it does not have to be.
Anyone familiar with this?
I think it is unlikely that you'll get a perfect match for your
professor's definition, except by luck - because I don't think there
really is a single definition. I don't think there is even a single
consistent definition for your features "output", "input", "steps", or
"finite" (in this context).
Ok, that explains a lot about software development.
That's why Turing went to such effort to define his Turing Machine (and
some other mathematicians of the time came up with alternative
"universal computers"). If you want a rigorous definition, you have to
go back to 0's and 1's and something mathematically precise such as a TM
or universal register machine (which is easier to program than a TM).
Of course, you are then restricting your algorithms - it no longer
includes a recipe for brownies, for example, but it does give you
something you can define and reason about.
Nothing useful here. Thanks anyway.
On 2022-11-02 20:45, Rick C wrote:
On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown
wrote:
[snip]
I remember a programming task from my days at university (doing
mathematics and computation), writing a function that calculated
the digits of pi. The language in question was a functional
programming language similar to Haskell. The end result was a
function that returned an infinite list - you could then print
out as many digits from that list as you wanted.
How did it return an infinite list?
Probably by using a non-strict evaluation order (https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation) where a data object can be "unbounded" or "potentially infinite" --
only the parts of the data structure that are later accessed become
"real" data held in memory.
You could of course say that the part (function) of the program that
produced the potentially infinite list is not an "algorithm" by
itself, and becomes an algorithm only when combined with the part of
the program that outputs a finite part of the list. But that seems
too limited, because that function is clearly independent of the
rest of the program and implements a well-defined computation.
Was it "deterministic" ? Not all algorithms are deterministic
and repeatable, but it is often a very useful characteristic, and
it might have been relevant for the course you were doing at the
time.
If it's not deterministic, it's not an algorithm. Flipping a coin
is not an algorithm.
Randomized or probabilistic algorithms are a huge research subject
with many practical applications, for example for the approximate
solution of hard optimization problems (https://en.wikipedia.org/wiki/Randomized_algorithm).
A very simple example is the "random sample consensus" method
(RANSAC), which is often called an "algorithm" although it is not deterministic
(https://en.wikipedia.org/wiki/Random_sample_consensus).
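A minimal Python sketch of the RANSAC idea, fitting a line y = a*x + b to
2D points in the presence of outliers. The parameter names, iteration count
and tolerance are arbitrary choices for illustration, not taken from any
particular implementation:

import random

def ransac_line(points, iterations=100, tol=1.0, seed=None):
    # Randomised: repeatedly pick two points at random, fit a line through
    # them, and keep the line supported by the most inliers. Different runs
    # (different seeds) may return different, equally acceptable models.
    rng = random.Random(seed)
    best_model, best_inliers = None, -1
    for _ in range(iterations):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue                    # skip degenerate (vertical) pairs
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = sum(1 for (x, y) in points if abs(y - (a * x + b)) <= tol)
        if inliers > best_inliers:
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

pts = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -15)]  # line plus outliers
print(ransac_line(pts, seed=0))    # typically ((2.0, 1.0), 10)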
If there's no definition, I guess I need to stop using the term.
If it has no definite end, it is not an algorithm. ... If it's not deterministic, it's not an algorithm. Flipping a coin is not an
algorithm.
How did it return an infinite list? You mean it returned as many
digits as you specified or waited to have calculated?
If you have an infinite storage system,
On 2022-11-02 20:45, Rick C wrote:
On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
[snip]
I remember a programming task from my days at university (doing
mathematics and computation), writing a function that calculated the
digits of pi. The language in question was a functional programming
language similar to Haskell. The end result was a function that
returned an infinite list - you could then print out as many digits from
that list as you wanted.
How did it return an infinite list?
Probably by using a non-strict evaluation order
(https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation)
where a data object can be "unbounded" or "potentially infinite" -- only
the parts of the data structure that are later accessed become "real"
data held in memory.
You could of course say that the part (function) of the program that
produced the potentially infinite list is not an "algorithm" by itself,
and becomes an algorithm only when combined with the part of the program
that outputs a finite part of the list. But that seems too limited,
because that function is clearly independent of the rest of the
program and implements a well-defined computation.
Was it "deterministic" ? Not all algorithms are deterministic and
repeatable, but it is often a very useful characteristic, and it might
have been relevant for the course you were doing at the time.
If it's not deterministic, it's not an algorithm. Flipping a coin is
not an algorithm.
Randomized or probabilistic algorithms are a huge research subject with
many practical applications, for example for the approximate solution of
hard optimization problems (https://en.wikipedia.org/wiki/Randomized_algorithm).
A very simple example is the "random sample consensus" method (RANSAC),
which is often called an "algorithm" although it is not deterministic (https://en.wikipedia.org/wiki/Random_sample_consensus).
Much of the engineering and computer science I learned in college was
taught like math. Terms were defined, in order to know what is being
said. Look at math. It's as much about the definitions as the
various rules.
If a term like algorithm is not well defined, then I won't use it in engineering unless I define it first.
On 11/2/2022 20:45, Rick C wrote:
...
You could argue that there is always some kind of input. For example,
2) Input - ??? I want to say input is optional. So a set of steps
to calculate some number of digits of pi would qualify as an
algorithm, in spite of not having inputs.
you could say that the digit index or the number of digits is the input
to the "calculate pi" algorithm.
You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.
Of course it is input. You change the number and the output changes.
4) Finite - the steps must come to an end, i.e. at some point the
algorithm has to produce the result, it can't be infinite.
Is that really true?
If you can accept a "calculate pi" algorithm that does not have an
input, then it is not finite.
Not true.
Of course it is true. Unless you specify the limits for an operation
which is infinite in nature you have the calculating algorithm
running infinitely.
No need to go about calculating Pi, divide 1 by 3 using the
algorithm taught at primary school using a pencil.
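Written down literally, the schoolbook procedure makes the point: nothing
in it says when to stop, so for 1/3 the sketch below (names chosen just for
illustration) runs forever unless the caller stops asking:

def long_division_digits(numerator, denominator):
    # Decimal digits of numerator/denominator after the point, produced by
    # ordinary long division. For 1/3 this never terminates on its own.
    remainder = numerator % denominator
    while True:
        remainder *= 10
        yield remainder // denominator
        remainder %= denominator

d = long_division_digits(1, 3)
print([next(d) for _ in range(6)])   # [3, 3, 3, 3, 3, 3]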
Without giving it much thought I'd say an algorithm is an unambiguous flowchart made up in order to achieve some goal. Anything more than that would take some qualifier to the word "algorithm".
On Wednesday, November 2, 2022 at 4:18:00 AM UTC-4, David Brown
wrote:
On 02/11/2022 06:42, Rick C wrote:
On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
wrote:
Rick C <gnuarm.del...@gmail.com> writes:
Anyone familiar with this?
Meh, there is not really an accepted formal definition, and to
some extent it is a philosophy question. The word algorithm
comes from Arabic, but the modern notion of Turing machines is
from the 1930s.
There are a bunch of viewpoints on the topic here:
https://en.wikipedia.org/wiki/Algorithm_characterizations
A further take is here:
https://plato.stanford.edu/entries/computer-science/#Algo
If there's no definition, I guess I need to stop using the term.
If you stopped using a term just because there is no single formal
definition, you'd have to stop saying anything at all. After all,
there's no definition for the terms "definition" or "term".
It just means that if you are writing something that is supposed to
be rigorous, you have to define what /you/ mean by a given term in
the context of the discussion.
Your idea that language is not defined is not valid. They are
defined in a dictionary, actually, many, not unlike standards, lol.
Terms with jargon meaning need their specific definition in that
context, or they have little value.
Much of the engineering and computer science I learned in college was
taught like math. Terms were defined, in order to know what is being
said. Look at math. It's as much about the definitions as the
various rules.
If a term like algorithm is not well defined, then I won't use it in engineering unless I define it first.
On Wednesday, November 2, 2022 at 3:29:38 PM UTC-4, Dimiter wrote:
On 11/2/2022 20:45, Rick C wrote:
...
You could argue that there is always some kind of input. For example,
2) Input - ??? I want to say input is optional. So a set of steps
to calculate some number of digits of pi would qualify as an
algorithm, in spite of not having inputs.
you could say that the digit index or the number of digits is the input
to the "calculate pi" algorithm.
You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.
Of course it is input. You change the number and the output changes.
If it's part of the procedure, that's not input by definition.
4) Finite - the steps must come to an end, i.e. at some point the
algorithm has to produce the result, it can't be infinite.
Is that really true?
If you can accept a "calculate pi" algorithm that does not have an
input, then it is not finite.
Not true.
Of course it is true. Unless you specify the limits for an operation
which is infinite in nature you have the calculating algorithm
running infinitely.
The procedure can specify a number of digits. Why are you arguing about this???
No need to go about calculating Pi, divide 1 by 3 using the
algorithm taught at primary school using a pencil.
Why bother with that? Just define it as 3 and be done!
Without giving it much thought I'd say an algorithm is an unambiguous
flowchart made up in order to achieve some goal. Anything more than that
would take some qualifier to the word "algorithm".
Must be nice to be the guy who makes the definitions.
[David Brown's] idea that language is not defined is not valid. They
are defined in a dictionary, actually, many, not unlike standards, lol.
Terms with jargon meaning need their specific definition in that context,
or they have little value.