• Definition of an Algorithm

    From Rick C@21:1/5 to All on Tue Nov 1 20:04:09 2022
    I recall in one of the early computer science classes I took, a professor defined an algorithm mathematically. She gave a list of properties an algorithm had to have to qualify as an algorithm. She also listed some features that were not
    required, such as being a computer program.

    I recall these features:

    1) Output - without output a procedure is pointless.

    2) Input - ??? I want to say input is optional. So a set of steps to calculate some number of digits of pi would qualify as an algorithm, in spite of not having inputs.

    3) Steps - she used qualifiers that indicated the steps had to be clear and unambiguous.

    4) Finite - the steps must come to an end, i.e. at some point the algorithm has to produce the result, it can't be infinite.

    I don't recall other qualifications, but I think there is at least one I'm missing. It was not a long list, and, like I've said, I don't think Input was on the list.

    The web searches seem to produce some pretty vague, garbage definitions, some even saying it is a computer program, which I'm certain it does not have to be.

    Anyone familiar with this?

    --

    Rick C.

    - Get 1,000 miles of free Supercharging
    - Tesla referral code - https://ts.la/richard11209

  • From Rick C@21:1/5 to Paul Rubin on Tue Nov 1 22:42:06 2022
    On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin wrote:
    Rick C <gnuarm.del...@gmail.com> writes:
    Anyone familiar with this?

    Meh, there is not really an accepted formal definition, and to some
    extent it is a philosophy question. The word algorithm comes from
    Arabic, but the modern notion of Turing machines is from the 1930s.

    There are a bunch of viewpoints on the topic here:

    https://en.wikipedia.org/wiki/Algorithm_characterizations

    A further take is here:

    https://plato.stanford.edu/entries/computer-science/#Algo

    If there's no definition, I guess I need to stop using the term.

    The Wikipedia article starts off as a poor substitute for what it's supposed to be, with no reference anywhere in the section. This is one of the worst examples I've ever seen, even for Wikipedia. I'm amazed it was ever allowed. Probably a matter of
    the camel getting its nose under the tent.

    But the section that actually tried to define an algorithm (Features of a good algorithm) seems to hit the nail on the head, as far as I can see. It mentions input though, which I'm pretty sure can be a null set without impacting that the process is an
    algorithm. So factually, input is not required, and not part of the definition.

    --

    Rick C.

    + Get 1,000 miles of free Supercharging
    + Tesla referral code - https://ts.la/richard11209

  • From Paul Rubin@21:1/5 to Rick C on Tue Nov 1 22:18:51 2022
    Rick C <gnuarm.deletethisbit@gmail.com> writes:
    Anyone familiar with this?

    Meh, there is not really an accepted formal definition, and to some
    extent it is a philosophy question. The word algorithm comes from
    Arabic, but the modern notion of Turing machines is from the 1930s.

    There are a bunch of viewpoints on the topic here:

    https://en.wikipedia.org/wiki/Algorithm_characterizations

    A further take is here:

    https://plato.stanford.edu/entries/computer-science/#Algo

  • From David Brown@21:1/5 to Rick C on Wed Nov 2 09:17:54 2022
    On 02/11/2022 06:42, Rick C wrote:
    On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
    wrote:
    Rick C <gnuarm.del...@gmail.com> writes:
    Anyone familiar with this?

    Meh, there is not really an accepted formal definition, and to
    some extent it is a philosophy question. The word algorithm comes
    from Arabic, but the modern notion of Turing machines is from the
    1930s.

    There are a bunch of viewpoints on the topic here:

    https://en.wikipedia.org/wiki/Algorithm_characterizations

    A further take is here:

    https://plato.stanford.edu/entries/computer-science/#Algo

    If there's no definition, I guess I need to stop using the term.

    If you stopped using a term just because there is no single formal
    definition, you'd have to stop saying anything at all. After all,
    there's no definition for the terms "definition" or "term".

    It just means that if you are writing something that is supposed to be rigorous, you have to define what /you/ mean by a given term in the
    context of the discussion.

  • From David Brown@21:1/5 to Rick C on Wed Nov 2 09:49:19 2022
    On 02/11/2022 04:04, Rick C wrote:
    I recall in one of the early computer science classes I took, a
    professor defined an algorithm mathematically. She
    gave a list of properties an algorithm had to have to qualify as an algorithm. She also listed some features that were not required,
    such as being a computer program.


    Just remember that any such definition will be a "lie-to-children" - an oversimplification that covers what you need at the time, but not the
    full picture. (That is not a bad thing in any way - it is essential to
    all learning. And it's the progress from oversimplifications upwards
    that keeps things fun. If you are not familiar with the term "lie-to-children", I recommend "The Science of the Discworld" that
    popularised it.)

    I recall these features:

    1) Output - without output a procedure is pointless.

    2) Input - ??? I want to say input is optional. So a set of steps
    to calculate some number of digits of pi would qualify as an
    algorithm, in spite of not having inputs.

    You could argue that there is always some kind of input. For example,
    you could say that the digit index or the number of digits is the input
    to the "calculate pi" algorithm.

    (As an aside, I find it fascinating that there is an algorithm to
    calculate the nth digit of the hexadecimal expansion of pi that does not
    need to calculate all the preceding digits.)
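    That is the Bailey-Borwein-Plouffe (BBP) result. Purely as an illustration
    - the function name, cut-off and use of double-precision floats below are
    illustrative choices, not part of the published formula - a rough Python
    sketch of the digit-extraction idea looks like this:

    def pi_hex_digit(n):
        # hex digit of pi at position n+1 after the point (n = 0 gives 2, as pi = 3.243F...)
        # float rounding limits this sketch to roughly the first few million digits
        def frac_sum(j):
            # fractional part of the sum over k of 16^(n-k) / (8k + j)
            s = 0.0
            for k in range(n + 1):
                s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
            k, term = n + 1, 1.0
            while term > 1e-17:
                term = 16.0 ** (n - k) / (8 * k + j)
                s += term
                k += 1
            return s % 1.0
        x = (4 * frac_sum(1) - 2 * frac_sum(4) - frac_sum(5) - frac_sum(6)) % 1.0
        return int(x * 16)

    The modular exponentiation is what lets it jump straight to digit n without
    computing any of the earlier ones.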


    3) Steps - she used qualifiers that indicated the steps had to be
    clear and unambiguous.

    4) Finite - the steps must come to an end, i.e. at some point the
    algorithm has to produce the result, it can't be infinite.


    Is that really true?

    If you can accept a "calculate pi" algorithm that does not have an
    input, then it is not finite.

    I remember a programming task from my days at university (doing
    mathematics and computation), writing a function that calculated the
    digits of pi. The language in question was a functional programming
    language similar to Haskell. The end result was a function that
    returned an infinite list - you could then print out as many digits from
    that list as you wanted.

    So what was the output? What kind of output do you allow from your
    algorithms? Does the algorithm have to stop and produce an output, or
    can it produce a stream of output underway? Can the output be an
    algorithm itself? Is it acceptable (according to your definition) for
    the output of a "calculate pi" algorithm to be a tuple of a digit of pi
    and an algorithm to calculate the next digit-algorithm tuple?
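    To make the "stream of output" option concrete, here is an illustrative
    Python sketch (using Gibbons' unbounded spigot algorithm, which is not the
    university exercise described above - just one way such a thing can look):

    from itertools import islice

    def pi_digits():
        # yields the decimal digits of pi one at a time, without ever stopping
        q, r, t, k, n, l = 1, 0, 1, 1, 3, 3
        while True:
            if 4 * q + r - t < n * t:
                yield n
                q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
            else:
                q, r, t, k, n, l = (q * k, (2 * q + r) * l, t * l, k + 1,
                                    (q * (7 * k + 2) + r * l) // (t * l), l + 2)

    print(list(islice(pi_digits(), 8)))   # [3, 1, 4, 1, 5, 9, 2, 6]

    The generator as a whole never terminates, yet every individual digit
    arrives after finitely many steps - which is the situation the questions
    above are probing.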


    I don't recall other qualifications, but I think there is at least
    one I'm missing. It was not a long list, and, like I've said, I
    don't think Input was on the list.


    Was it "deterministic" ? Not all algorithms are deterministic and
    repeatable, but it is often a very useful characteristic, and it might
    have been relevant for the course you were doing at the time.

    The web searches seem to produce some pretty vague, garbage
    definitions, some even saying it is a computer program, which I'm
    certain it does not have to be.

    Anyone familiar with this?


    I think it is unlikely that you'll get a perfect match for your
    professor's definition, except by luck - because I don't think there
    really is a single definition. I don't think there is even a single
    consistent definition for your features "output", "input", "steps", or
    "finite" (in this context).

    That's why Turing went to such effort to define his Turing Machine (and
    some other mathematicians of the time came up with alternative
    "universal computers"). If you want a rigorous definition, you have to
    go back to 0's and 1's and something mathematically precise such as a TM
    or universal register machine (which is easier to program than a TM).
    Of course, you are then restricting your algorithms - it no longer
    includes a recipe for brownies, for example, but it does give you
    something you can define and reason about.
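    If anyone wants to play with that idea, a toy register machine takes only a
    few lines of Python. This is an illustrative sketch, not Turing's or anyone
    else's standard formulation:

    def run(program, registers):
        # ("inc", r) adds 1 to register r
        # ("decjz", r, addr) jumps to addr if register r is 0, else subtracts 1 from r
        pc = 0
        while pc < len(program):
            op = program[pc]
            if op[0] == "inc":
                registers[op[1]] = registers.get(op[1], 0) + 1
                pc += 1
            elif registers.get(op[1], 0) == 0:
                pc = op[2]
            else:
                registers[op[1]] -= 1
                pc += 1
        return registers

    # add r1 into r0; register 2 stays 0, so the last instruction is an unconditional jump back
    add = [("decjz", 1, 3), ("inc", 0), ("decjz", 2, 0)]
    print(run(add, {0: 2, 1: 3}))   # register 0 ends up holding 5

    Everything such a machine does is built from those two instructions, which
    is what makes it possible to reason about rigorously.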

  • From Rick C@21:1/5 to David Brown on Wed Nov 2 11:45:40 2022
    On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
    On 02/11/2022 04:04, Rick C wrote:
    I recall in one of the early computer science classes I took, a
    professor defined an algorithm mathematically. She
    gave a list of properties an algorithm had to have to qualify as an algorithm. She also listed some features that were not required,
    such as being a computer program.

    Just remember that any such definition will be a "lie-to-children" - an oversimplification that covers what you need at the time, but not the
    full picture.

    Sorry, I have no idea what you are talking about.


    (That is not a bad thing in any way - it is essential to
    all learning. And it's the progress from oversimplifications upwards
    that keeps things fun. If you are not familiar with the term "lie-to-children", I recommend "The Science of the Discworld" that popularised it.)
    I recall these features:

    1) Output - without output a procedure is pointless.

    2) Input - ??? I want to say input is optional. So a set of steps
    to calculate some number of digits of pi would qualify as an
    algorithm, in spite of not having inputs.
    You could argue that there is always some kind of input. For example,
    you could say that the digit index or the number of digits is the input
    to the "calculate pi" algorithm.

    You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.


    (As an aside, I find it fascinating that there is an algorithm to
    calculate the nth digit of the hexadecimal expansion of pi that does not need to calculate all the preceding digits.)

    3) Steps - she used qualifiers that indicated the steps had to be
    clear and unambiguous.

    4) Finite - the steps must come to an end, i.e. at some point the algorithm has to produce the result, it can't be infinite.

    Is that really true?

    If you can accept a "calculate pi" algorithm that does not have an
    input, then it is not finite.

    Not true.

    If it has no definite end, it is not an algorithm. That's why an operating system is not typically an algorithm, it has no specific termination unless you have a command to shut down the computer, I suppose.


    I remember a programming task from my days at university (doing
    mathematics and computation), writing a function that calculated the
    digits of pi. The language in question was a functional programming
    language similar to Haskell. The end result was a function that
    returned an infinite list - you could then print out as many digits from that list as you wanted.

    How did it return an infinite list? You mean it returned as many digits as you specified or waited to have calculated? If you have an infinite storage system, that's a pretty remarkable achievement. But I expect it would be powered by an infinite
    improbability drive.


    So what was the output? What kind of output do you allow from your algorithms? Does the algorithm have to stop and produce an output, or
    can it produce a stream of output underway? Can the output be an
    algorithm itself? Is it acceptable (according to your definition) for
    the output of a "calculate pi" algorithm to be a tuple of a digit of pi
    and an algorithm to calculate the next digit-algorithm tuple?
    I don't recall other qualifications, but I think there is at least
    one I'm missing. It was not a long list, and, like I've said, I
    don't think Input was on the list.

    Was it "deterministic" ? Not all algorithms are deterministic and repeatable, but it is often a very useful characteristic, and it might
    have been relevant for the course you were doing at the time.

    If it's not deterministic, it's not an algorithm. Flipping a coin is not an algorithm.


    The web searches seem to produce some pretty vague, garbage
    definitions, some even saying it is a computer program, which I'm
    certain it does not have to be.

    Anyone familiar with this?

    I think it is unlikely that you'll get a perfect match for your
    professor's definition, except by luck - because I don't think there
    really is a single definition. I don't think there is even a single consistent definition for your features "output", "input", "steps", or "finite" (in this context).

    Ok, that explains a lot about software development.


    That's why Turing went to such effort to define his Turing Machine (and
    some other mathematicians of the time came up with alternative
    "universal computers"). If you want a rigorous definition, you have to
    go back to 0's and 1's and something mathematically precise such as a TM
    or universal register machine (which is easier to program than a TM).
    Of course, you are then restricting your algorithms - it no longer
    includes a recipe for brownies, for example, but it does give you
    something you can define and reason about.

    Nothing useful here. Thanks anyway.

    --

    Rick C.

    -+ Get 1,000 miles of free Supercharging
    -+ Tesla referral code - https://ts.la/richard11209

  • From Rick C@21:1/5 to David Brown on Wed Nov 2 11:37:18 2022
    On Wednesday, November 2, 2022 at 4:18:00 AM UTC-4, David Brown wrote:
    On 02/11/2022 06:42, Rick C wrote:
    On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
    wrote:
    Rick C <gnuarm.del...@gmail.com> writes:
    Anyone familiar with this?

    Meh, there is not really an accepted formal definition, and to
    some extent it is a philosophy question. The word algorithm comes
    from Arabic, but the modern notion of Turing machines is from the
    1930s.

    There are a bunch of viewpoints on the topic here:

    https://en.wikipedia.org/wiki/Algorithm_characterizations

    A further take is here:

    https://plato.stanford.edu/entries/computer-science/#Algo

    If there's no definition, I guess I need to stop using the term.
    If you stopped using a term just because there is no single formal definition, you'd have to stop saying anything at all. After all,
    there's no definition for the terms "definition" or "term".

    It just means that if you are writing something that is supposed to be rigorous, you have to define what /you/ mean by a given term in the
    context of the discussion.

    Your idea that language is not defined is not valid. They are defined in a dictionary, actually, many, not unlike standards, lol. Terms with jargon meaning need their specific definition in that context, or they have little value.

    Much of the engineering and computer science I learned in college was taught like math. Terms were defined, in order to know what is being said. Look at math. It's as much about the definitions as the various rules.

    If a term like algorithm is not well defined, then I won't use it in engineering unless I define it first.

    --

    Rick C.

    -- Get 1,000 miles of free Supercharging
    -- Tesla referral code - https://ts.la/richard11209

  • From Dimiter_Popoff@21:1/5 to Rick C on Wed Nov 2 21:29:22 2022
    On 11/2/2022 20:45, Rick C wrote:
    ...

    2) Input - ??? I want to say input is optional. So a set of steps
    to calculate some number of digits of pi would qualify as an
    algorithm, in spite of not having inputs.
    You could argue that there is always some kind of input. For example,
    you could say that the digit index or the number of digits is the input
    to the "calculate pi" algorithm.

    You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.

    Of course it is input. You change the number and the output changes.

    .....

    4) Finite - the steps must come to an end, i.e. at some point the
    algorithm has to produce the result, it can't be infinite.

    Is that really true?

    If you can accept a "calculate pi" algorithm that does not have an
    input, then it is not finite.

    Not true.

    Of course it is true. Unless you specify the limits for an operation
    which is infinite in nature you have the calculating algorithm
    running infinitely.
    No need to go about calculating Pi, divide 1 by 3 using the
    algorithm taught at primary school using a pencil.

    Without giving it much thought I'd say an algorithm is an unambiguous
    flowchart made up in order to achieve some goal. Anything more than that
    would take some qualifier to the word "algorithm".

  • From Niklas Holsti@21:1/5 to Rick C on Wed Nov 2 21:53:04 2022
    On 2022-11-02 20:45, Rick C wrote:
    On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:

    [snip]


    I remember a programming task from my days at university (doing
    mathematics and computation), writing a function that calculated the
    digits of pi. The language in question was a functional programming
    language similar to Haskell. The end result was a function that
    returned an infinite list - you could then print out as many digits from
    that list as you wanted.

    How did it return an infinite list?


    Probably by using a non-strict evaluation order (https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation)
    where a data object can be "unbounded" or "potentially infinite" -- only
    the parts of the data structure that are later accessed become "real"
    data held in memory.

    You could of course say that the part (function) of the program that
    produced the potentially infinite list is not an "algorithm" by itself,
    and becomes an algorithm only when combined with the part of the program
    that outputs a finite part of the list. But that seems too limited,
    because that function is clearly independent of the rest of the
    program and implements a well-defined computation.


    Was it "deterministic" ? Not all algorithms are deterministic and
    repeatable, but it is often a very useful characteristic, and it might
    have been relevant for the course you were doing at the time.

    If it's not deterministic, it's not an algorithm. Flipping a coin is
    not an algorithm.

    Randomized or probabilistic algorithms are a huge research subject with
    many practical applications, for example for the approximate solution of
    hard optimization problems (https://en.wikipedia.org/wiki/Randomized_algorithm).

    A very simple example is the "random sample consensus" method (RanSaC),
    which is often called an "algorithm" although it is not deterministic (https://en.wikipedia.org/wiki/Random_sample_consensus).

  • From David Brown@21:1/5 to Rick C on Wed Nov 2 22:12:33 2022
    On 02/11/2022 19:45, Rick C wrote:
    On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
    On 02/11/2022 04:04, Rick C wrote:
    I recall in one of the early computer science classes I took, a
    professor defined an algorithm mathematically. She
    gave a list of properties an algorithm had to have to qualify as an
    algorithm. She also listed some features that were not required,
    such as being a computer program.

    Just remember that any such definition will be a "lie-to-children" - an
    oversimplification that covers what you need at the time, but not the
    full picture.

    Sorry, I have no idea what you are talking about.

    When you are explaining a complicated subject to someone who does not
    know about it (and that's typically the case at school or university),
    you simplify - you are not telling the truth, you are telling a "lie to children". If the professor were to try to tell you all about what she understood by the meaning of "algorithm", it would take the full course
    instead of a class. You were given a "lie to children" - and that is
    entirely appropriate.

    You appear to be mistaking it for a complete definition, which is less good.



    (That is not a bad thing in any way - it is essential to
    all learning. And it's the progress from oversimplifications upwards
    that keeps things fun. If you are not familiar with the term
    "lie-to-children", I recommend "The Science of the Discworld" that
    popularised it.)
    I recall these features:

    1) Output - without output a procedure is pointless.

    2) Input - ??? I want to say input is optional. So a set of steps
    to calculate some number of digits of pi would qualify as an
    algorithm, in spite of not having inputs.
    You could argue that there is always some kind of input. For example,
    you could say that the digit index or the number of digits is the input
    to the "calculate pi" algorithm.

    You can argue anything. A procedure to calculate pi without
    specifying how many digits would not be an algorithm because of the
    "finite" requirement. It doesn't need an input to set the number of
    digits if that is built into the procedure.


    It is most certainly an algorithm - just not a finite algorithm. Your arguments here are circular.


    (As an aside, I find it fascinating that there is an algorithm to
    calculate the nth digit of the hexadecimal expansion of pi that does not
    need to calculate all the preceding digits.)

    3) Steps - she used qualifiers that indicated the steps had to be
    clear and unambiguous.

    4) Finite - the steps must come to an end, i.e. at some point the
    algorithm has to produce the result, it can't be infinite.

    Is that really true?

    If you can accept a "calculate pi" algorithm that does not have an
    input, then it is not finite.

    Not true.

    Yes, true - the digits of pi never end.

    It is actually pretty irrelevant whether you consider a way to calculate
    the first ten digits of pi as a function "calculate_pi_10()" or as "calculate_pi(10)". It doesn't matter if the input is fixed in the
    algorithm or not.

    (Can I assume you have never done any functional programming? You would
    find it very enlightening - it is much more appropriate for appreciating computation theory than imperative languages like C or Forth, or
    hardware design languages.)


    If it has no definite end, it is not an algorithm. That's why an
    operating system is not typically an algorithm, it has no specific termination unless you have a command to shut down the computer, I suppose.


    Again, you are giving a circular argument. Your objection to unending algorithms is merely that they don't fit your remembered lesson that
    talked about finite algorithms. (An OS would probably be better
    described as a collection of algorithms rather than a single algorithm.)


    I remember a programming task from my days at university (doing
    mathematics and computation), writing a function that calculated the
    digits of pi. The language in question was a functional programming
    language similar to Haskell. The end result was a function that
    returned an infinite list - you could then print out as many digits from
    that list as you wanted.

    How did it return an infinite list? You mean it returned as many
    digits as you specified or waited to have calculated? If you have an
    infinite storage system, that's a pretty remarkable achievement. But I expect it would be powered by an infinite improbability drive.


    No, it returned an infinite list.

    I know you are familiar with Python, but I don't know at what level of
    detail - let me give a comparison here.

    When you write "range(1000000)", Python generates a list of a million
    numbers from 0 to 999,999. This takes time and memory. But if you
    write "xrange(1000000)", it returns a generator expression, quickly and efficiently. It only actually generates the numbers when it needs to.

    In Haskell, you can write :

    numbers = 1 : [x + 1 | x <- numbers]

    This would be like "numbers = [1] + [x + 1 for x in numbers]" in Python,
    if such recursive statements were allowed.

    This is an infinite list - it behaves in all ways as an unending list of integers.
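    A rough Python analogy, offered only as an illustration (a generator plays
    the role of the lazy list):

    from itertools import islice

    def numbers():
        # lazily yields 1, 2, 3, ... - the values are never all materialised at once
        n = 1
        while True:
            yield n
            n += 1

    print(list(islice(numbers(), 5)))   # [1, 2, 3, 4, 5], like "take 5 numbers" in Haskell

    Only the elements you actually ask for are ever computed or stored.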


    So what was the output? What kind of output do you allow from your
    algorithms? Does the algorithm have to stop and produce an output, or
    can it produce a stream of output underway? Can the output be an
    algorithm itself? Is it acceptable (according to your definition) for
    the output of a "calculate pi" algorithm to be a tuple of a digit of pi
    and an algorithm to calculate the next digit-algorithm tuple?
    I don't recall other qualifications, but I think there is at least
    one I'm missing. It was not a long list, and, like I've said, I
    don't think Input was on the list.

    Was it "deterministic" ? Not all algorithms are deterministic and
    repeatable, but it is often a very useful characteristic, and it might
    have been relevant for the course you were doing at the time.

    If it's not deterministic, it's not an algorithm. Flipping a coin is not an algorithm.


    Non-deterministic does not mean random.


    The web searches seem to produce some pretty vague, garbage
    definitions, some even saying it is a computer program, which I'm
    certain it does not have to be.

    Anyone familiar with this?

    I think it is unlikely that you'll get a perfect match for your
    professor's definition, except by luck - because I don't think there
    really is a single definition. I don't think there is even a single
    consistent definition for your features "output", "input", "steps", or
    "finite" (in this context).

    Ok, that explains a lot about software development.


    Software development is a pretty big field.


    That's why Turing went to such effort to define his Turing Machine (and
    some other mathematicians of the time came up with alternative
    "universal computers"). If you want a rigorous definition, you have to
    go back to 0's and 1's and something mathematically precise such as a TM
    or universal register machine (which is easier to program than a TM).
    Of course, you are then restricting your algorithms - it no longer
    includes a recipe for brownies, for example, but it does give you
    something you can define and reason about.

    Nothing useful here. Thanks anyway.


    I can't imagine anything in this thread being "useful" - surely it is
    just an interesting discussion?

  • From David Brown@21:1/5 to Niklas Holsti on Wed Nov 2 21:53:24 2022
    On 02/11/2022 20:53, Niklas Holsti wrote:
    On 2022-11-02 20:45, Rick C wrote:
    On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown
    wrote:

    [snip]


    I remember a programming task from my days at university (doing
    mathematics and computation), writing a function that calculated
    the digits of pi. The language in question was a functional
    programming language similar to Haskell. The end result was a
    function that returned an infinite list - you could then print
    out as many digits from that list as you wanted.

    How did it return an infinite list?


    Probably by using a non-strict evaluation order (https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation) where a data object can be "unbounded" or "potentially infinite" --
    only the parts of the data structure that are later accessed become
    "real" data held in memory.


    The more relevant term is "lazy evaluation". It is very common in
    functional programming languages, and allows you to work with infinite
    lists, infinite mutually recursive functions, and all kinds of fun.

    You could of course say that the part (function) of the program that
    produced the potentially infinite list is not an "algorithm" by
    itself, and becomes an algorithm only when combined with the part of
    the program that outputs a finite part of the list. But that seems
    too limited, because that function is clearly independent of the
    rest of the program and implements a well-defined computation.


    The function returns a function as part of the output.


    Was it "deterministic" ? Not all algorithms are deterministic
    and repeatable, but it is often a very useful characteristic, and
    it might have been relevant for the course you were doing at the
    time.

    If it's not deterministic, it's not an algorithm. Flipping a coin
    is not an algorithm.

    Randomized or probabilistic algorithms are a huge research subject
    with many practical applications, for example for the approximate
    solution of hard optimization problems (https://en.wikipedia.org/wiki/Randomized_algorithm).

    A very simple example is the "random sample consensus" method
    (RanSaC), which is often called an "algorithm" although it is not deterministic
    (https://en.wikipedia.org/wiki/Random_sample_consensus).


    All sorts of algorithms are non-deterministic. Pretty much any
    algorithm that is executed in parallel parts is going to have
    non-deterministic execution, which can result in non-deterministic outputs.

  • From Paul Rubin@21:1/5 to Rick C on Wed Nov 2 14:12:27 2022
    Rick C <gnuarm.deletethisbit@gmail.com> writes:
    If there's no definition, I guess I need to stop using the term.

    I wouldn't say there's no definition. I'd say there's no definition
    that everybody agrees on, so if you are trying to make formal claims
    about algorithms, you have to be precise. Otherwise, I would say, the different concepts of algorithm have enough shared features that in most informal usages, you can use the term without confusion.

    The same thing applies to the term "probability", and we all use that
    term anyway.

  • From Paul Rubin@21:1/5 to Rick C on Wed Nov 2 14:45:38 2022
    Rick C <gnuarm.deletethisbit@gmail.com> writes:
    If it has no definite end, it is not an algorithm. ... If it's not deterministic, it's not an algorithm. Flipping a coin is not an
    algorithm.

    There are lots of what are called "randomized algorithms" which involve flipping coins. They are used in practice all the time. An example
    might be a Monte Carlo method for computing an integral. Whether
    every problem solvable by a randomized algorithm can be solved equally efficiently by a non-randomized one is a big unsolved problem in CS
    theory, called "P vs BPP". If you've heard of the more famous P vs NP
    problem, it's sort of similar.

    There aren't necessarily termination guarantees either. Tons of
    practical cryptography programs somewhere contain a routine to generate,
    say, a 1000 bit prime number. The method (that I would informally say
    is an algorithm) is:

    1. Generate a random 1000 bit number by flipping coins.
    2. Check whether the resulting number is prime. (There are known
    deterministic methods for this, but in practice one often uses
    a probabilistic test for this step too).
    3. If yes, you are done. Otherwise, go back to step 1 and try again.

    You can make a good probabilistic estimate of how long the above method
    will take, but obviously there is no definite upper bound, since you can
    keep generating composite numbers at step 1.
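    A sketch of that method in Python, purely as an illustration - the round
    count and bit length are arbitrary choices here, and a real cryptographic
    library would use a vetted implementation:

    import random

    def probably_prime(n, rounds=40):
        # Miller-Rabin probabilistic primality test, in its standard textbook form
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def random_prime(bits=1000):
        # steps 1-3 above: keep drawing random odd candidates until one passes the test
        while True:
            candidate = random.getrandbits(bits) | (1 << (bits - 1)) | 1
            if probably_prime(candidate):
                return candidate

    The expected number of tries is modest (roughly proportional to the number
    of bits), but there is no fixed upper bound on how many candidates get
    rejected along the way.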

    Almost everything in real-world engineering depends on randomness too. According to statistical thermodynamics, the air molecules in your room
    are moving around at random, and there is a small chance that all of
    them could coincidentally travel to one small corner of the room. Your
    vacuum cleaner's engineering depends on that not happening, and in
    practice it doesn't. Semiconductors depend on the statistical
    properties of charge carriers, and all that. Randomness is everywhere,
    don't be afraid of it.

    How did it return an infinite list? You mean it returned as many
    digits as you specified or waited to have calculated?

    It's a finite data structure that represents an infinite list, just like
    a sequence of 32 ones and zeros is a data structure that represents an
    integer in a certain range. In Haskell you can say "x = [1,2..]" and
    that means x represents the infinite list 1,2,3,4,5..... If you try
    to print x, you will see 1,2,3,4,5... spewing onto the terminal similar
    to if you ran "1 begin dup . 1+ again" in Forth. More normally you
    would say "print (take 5 x)" to print the first 5 elements of x, and
    get 1,2,3,4,5.

    If you have an infinite storage system,

    There's no infinite storage system, it's more like a symbolic algebra
    system that has a symbol representing the exact number pi. If you
    actually try to print pi, it will start printing 3.14159... using more
    and more computation and storage. But if you say "print cos(pi)", it will print -1 exactly, that sort of thing. There's no infinite storage, just
    some behind the scenes shortcuts.
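    For a concrete feel for that, a computer algebra library such as SymPy
    behaves exactly this way (a small illustration, not something from the
    original discussion):

    import sympy

    print(sympy.cos(sympy.pi))    # prints -1 exactly; the symbol pi is manipulated, not its digits
    print(sympy.N(sympy.pi, 30))  # digits are only produced on demand - here 30 significant figures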

  • From Rick C@21:1/5 to Niklas Holsti on Wed Nov 2 15:30:08 2022
    On Wednesday, November 2, 2022 at 3:53:10 PM UTC-4, Niklas Holsti wrote:
    On 2022-11-02 20:45, Rick C wrote:
    On Wednesday, November 2, 2022 at 4:49:26 AM UTC-4, David Brown wrote:
    [snip]
    I remember a programming task from my days at university (doing
    mathematics and computation), writing a function that calculated the
    digits of pi. The language in question was a functional programming
    language similar to Haskell. The end result was a function that
    returned an infinite list - you could then print out as many digits from that list as you wanted.

    How did it return an infinite list?
    Probably by using a non-strict evaluation order (https://en.wikipedia.org/wiki/Evaluation_strategy#Non-strict_evaluation) where a data object can be "unbounded" or "potentially infinite" -- only
    the parts of the data structure that are later accessed become "real"
    data held in memory.

    You could of course say that the part (function) of the program that
    produced the potentially infinite list is not an "algorithm" by itself,
    and becomes an algorithm only when combined with the part of the program
    that outputs a finite part of the list. But that seems too limited,
    because that function is clearly independent of the rest of the
    program and implements a well-defined computation.

    Why would it have to be an algorithm?


    Was it "deterministic" ? Not all algorithms are deterministic and
    repeatable, but it is often a very useful characteristic, and it might
    have been relevant for the course you were doing at the time.

    If it's not deterministic, it's not an algorithm. Flipping a coin is
    not an algorithm.
    Randomized or probabilistic algorithms are a huge research subject with
    many practical applications, for example for the approximate solution of
    hard optimization problems (https://en.wikipedia.org/wiki/Randomized_algorithm).

    A very simple example is the "random sample consensus" method (RanSaC),
    which is often called an "algorithm" although it is not deterministic (https://en.wikipedia.org/wiki/Random_sample_consensus).

    Yeah, people often misuse terms.

    --

    Rick C.

    ++ Get 1,000 miles of free Supercharging
    ++ Tesla referral code - https://ts.la/richard11209

  • From Paul Rubin@21:1/5 to Rick C on Wed Nov 2 15:56:46 2022
    Rick C <gnuarm.deletethisbit@gmail.com> writes:
    Much of the engineering and computer science I learned in college was
    taught like math. Terms were defined, in order to know what is being
    said. Look at math. It's as much about the definitions as the
    various rules.

    I believe electrical engineers frequently use the Dirac delta function,
    whose value is infinite at x=0 and zero everywhere else. When you
    convolve it with another function, you get back the function that you
    convolved with. But by standard mathematical formalism, there is no
    such function as the Dirac delta. That didn't stop physicists and
    engineers from using it anyway, starting around the 1920s(?).
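    Written out, the property being used is the "sifting" behaviour of the
    delta under convolution (the standard identity, added here only for
    reference):

    (f * \delta)(x) = \int_{-\infty}^{\infty} f(t) \, \delta(x - t) \, dt = f(x)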

    A rigorous mathematical formalism for such "functions" was not developed
    until the 1950s, called "generalized functions" or "Schwartz
    distributions". I'd be very surprised if any engineers ever study that.
    They just use the Dirac delta without worrying too much about
    mathematical rigor, and as long as they don't go too crazy, it acts the
    way they expect it to.

    If a term like algorithm is not well defined, then I won't use it in engineering unless I define it first.

    Engineering practice by others has not been so careful. You might find
    a "definition" of the Dirac delta in an engineering book, but if you
    look closely at the mathematical details, it will be hand-wavy or have
    gaps. They want mostly to agree on the features that they rely on, and
    to not worry about the corner cases that they don't use.

    Here's a definitional problem for you. Let's say you flip a coin and
    then cover it with your hand, so you don't see how it came up. What is
    the probability that it came up heads? One notion is that the
    probability is obviously 50%, assuming it was a fair coin. A
    conflicting notion is that it is obviously not 50%: the coin has already
    been flipped and has landed, so it is either definitely heads or else it
    is definitely tails. That is, the probability is either 100% or 0%, and
    you simply don't know which.

    Those two notions of probability are called Bayesianism and frequentism. Philosophers have been arguing about them for hundreds of years and they
    still haven't stopped. Mathematicians deal with this by treating
    probability as a topic in analysis involving measure spaces of yada
    yada. But a philosopher might say instead that probability is a
    sub-topic in the part of philosophy that deals with the nature of
    knowledge, and then the Bayesian view is simply a scheme for saying how strongly you believe a given proposition.

    What I'm saying here is that probability, like algorithm, is another one
    of those terms that doesn't have a single, universally accepted,
    rigorous definition, but that doesn't stop it from being important
    anyway.

    Re probability, this article is pretty good:

    https://ncatlab.org/nlab/show/Is+probability+theory+a+branch+of+mathematics%3F

  • From Rick C@21:1/5 to Dimiter on Wed Nov 2 15:26:55 2022
    On Wednesday, November 2, 2022 at 3:29:38 PM UTC-4, Dimiter wrote:
    On 11/2/2022 20:45, Rick C wrote:
    ...

    2) Input - ??? I want to say input is optional. So a set of steps
    to calculate some number of digits of pi would qualify as an
    algorithm, in spite of not having inputs.
    You could argue that there is always some kind of input. For example,
    you could say that the digit index or the number of digits is the input
    to the "calculate pi" algorithm.

    You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.
    Of course it is input. You change the number and the output changes.

    If it's part of the procedure, that's not input by definition.


    4) Finite - the steps must come to an end, i.e. at some point the
    algorithm has to produce the result, it can't be infinite.

    Is that really true?

    If you can accept a "calculate pi" algorithm that does not have an
    input, then it is not finite.

    Not true.
    Of course it is true. Unless you specify the limits for an operation
    which is infinite in nature you have the calculating algorithm
    running infinitely.

    The procedure can specify a number of digits. Why are you arguing about this???


    No need to go about calculating Pi, divide 1 by 3 using the
    algorithm taught at primary school using a pencil.

    Why bother with that? Just define it as 3 and be done!


    Without giving it much thought I'd say an algorithm is an unambiguous flowchart made up in order to achieve some goal. Anything more than that would take some qualifier to the word "algorithm".

    Must be nice to be the guy who makes the definitions.

    --

    Rick C.

    +- Get 1,000 miles of free Supercharging
    +- Tesla referral code - https://ts.la/richard11209

  • From David Brown@21:1/5 to Rick C on Thu Nov 3 09:56:43 2022
    On 02/11/2022 19:37, Rick C wrote:
    On Wednesday, November 2, 2022 at 4:18:00 AM UTC-4, David Brown
    wrote:
    On 02/11/2022 06:42, Rick C wrote:
    On Wednesday, November 2, 2022 at 1:18:56 AM UTC-4, Paul Rubin
    wrote:
    Rick C <gnuarm.del...@gmail.com> writes:
    Anyone familiar with this?

    Meh, there is not really an accepted formal definition, and to
    some extent it is a philosophy question. The word algorithm
    comes from Arabic, but the modern notion of Turing machines is
    from the 1930s.

    There are a bunch of viewpoints on the topic here:

    https://en.wikipedia.org/wiki/Algorithm_characterizations

    A further take is here:

    https://plato.stanford.edu/entries/computer-science/#Algo

    If there's no definition, I guess I need to stop using the term.
    If you stopped using a term just because there is no single formal
    definition, you'd have to stop saying anything at all. After all,
    there's no definition for the terms "definition" or "term".

    It just means that if you are writing something that is supposed to
    be rigorous, you have to define what /you/ mean by a given term in
    the context of the discussion.

    Your idea that language is not defined is not valid. They are
    defined in a dictionary, actually, many, not unlike standards, lol.

    Dictionaries do not /define/ words, they /describe/ words. They say how
    words are used, and what people generally mean when they use the word.
    That is viewing the issue from the opposite side - use first, then
    definition. For technical terms in technical fields, you define the
    term first, then use it.

    Terms with jargon meaning need their specific definition in that
    context, or they have little value.

    Yes.

    But often the definition is not universally accepted - that's why in
    technical papers, standards documents, etc., important terms are defined
    in the document. They say exactly what you mean by the term in that
    context.

    So you can use the word "algorithm" loosely, and people usually know approximately what you mean. But if you want to be rigorous and rely on properties that are not always agreed upon, you must define you /you/
    will use the term in the particular context.


    Much of the engineering and computer science I learned in college was
    taught like math. Terms were defined, in order to know what is being
    said. Look at math. It's as much about the definitions as the
    various rules.


    Sure. But a lot of these things are simplifications, and sometimes they
    will be the particular choice of definition that suits the teacher or
    class. It applies to maths, it applies to engineering, computer
    science, and any other technical topic.

    You probably first learned that "integration" meant "finding the area
    under a curve". Then you later learn about complex analysis and
    integration in circles around poles of complex functions - the
    "definition" of integration that you learned now makes no sense
    whatsoever. (It is also meaningless for the Dirac delta function, since
    Paul brought that up.)

    When you learned about "algorithms" in your early computer science
    classes, you similarly got a simplistic view. Since my university
    course was more theoretical than yours (AFAIUI), I was probably given a
    more advanced and nuanced definition of "algorithm" - and I learned that
    there is no single definition. You can keep learning, keep
    investigating, keep thinking, and come up with new ways to define such
    terms, new subclasses or variants of "algorithm", new characteristics
    that can be useful or interesting. It is all "right" - the only thing
    that is /wrong/ is to claim that /your/ definition is /the/ definition.

    If a term like algorithm is not well defined, then I won't use it in engineering unless I define it first.


    If you need it to be well defined in the context of what you are doing,
    then yes, you need to define what you mean by it. If it doesn't need to
    be rigorously defined, then you don't need to do so.

    So if you are writing "This is the documentation for the algorithm used
    to control the speed of the motor" for some embedded software, then
    there is absolutely no need to define the term "algorithm". On the
    other hand, if you are writing a paper titled "Proof that there is no
    algorithm that can determine the halting state of all possible Turing Machines", then you absolutely /do/ need to define the term.

  • From David Brown@21:1/5 to Rick C on Thu Nov 3 10:25:55 2022
    On 02/11/2022 23:26, Rick C wrote:
    On Wednesday, November 2, 2022 at 3:29:38 PM UTC-4, Dimiter wrote:
    On 11/2/2022 20:45, Rick C wrote:
    ...

    2) Input - ??? I want to say input is optional. So a set of steps
    to calculate some number of digits of pi would qualify as an
    algorithm, in spite of not having inputs.
    You could argue that there is always some kind of input. For example,
    you could say that the digit index or the number of digits is the input to the "calculate pi" algorithm.

    You can argue anything. A procedure to calculate pi without specifying how many digits would not be an algorithm because of the "finite" requirement. It doesn't need an input to set the number of digits if that is built into the procedure.
    Of course it is input. You change the number and the output changes.

    If it's part of the procedure, that's not input by definition.


    By /your/ definition only.

    In Python (for familiarity), you could have :

    mult = lambda x, y : x * y
    mult3 = lambda x : mult(3, x)
    mult3_5 = lambda : mult3(5)

    It makes no difference to the theory or the meaning of "algorithm"
    whether the input is part of the function or not. (Of course it makes a
    major difference in practical use, but not for the theory.)

    If you want to learn more, the term to google about is "currying".
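    For illustration only (not part of the original exchange), the Python
    standard library spells the same idea with functools.partial:

    from functools import partial

    mult = lambda x, y: x * y
    mult3 = partial(mult, 3)        # the 3 is now "baked into the procedure"
    print(mult3(5), mult(3, 5))     # prints "15 15" - the same computation either way

    Whether the 3 arrives as an argument or is frozen into the function makes
    no difference to what is being computed.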


    4) Finite - the steps must come to an end, i.e. at some point the
    algorithm has to produce the result, it can't be infinite.

    Is that really true?

    If you can accept a "calculate pi" algorithm that does not have an
    input, then it is not finite.

    Not true.
    Of course it is true. Unless you specify the limits for an operation
    which is infinite in nature you have the calculating algorithm
    running infinitely.

    The procedure can specify a number of digits. Why are you arguing about this???


    This is a discussion, not an argument. You brought up a rather
    theoretical topic, and people have been discussing it beyond your
    current level of familiarity. The sensible reaction is to see what you
    can learn from this, or to accept that it is going beyond what you know
    about and you are dropping out. This is all very theoretical, with no practical application (that I can imagine) for the kind of work we comp.arch.embedded denizens do in real life - either you find these
    things interesting and fun to think about, or you don't. (I hope you
    /do/ find it interesting - but I do understand that mathematics and computational theory is an unusual hobby, even for professional software
    and hardware developers.)

    You perhaps made the first post thinking there is a nice, simple and
    clear definition of "algorithm" and you'd get a nice, simple and clear
    answer to which clause you'd forgotten from that long-ago lecture. The
    list on Wikipedia of the dozens of mathematicians' definitions of
    "algorithm" through the years should have been a clue that things aren't
    that simple.



    No need to go about calculating Pi, divide 1 by 3 using the
    algorithm taught at primary school using a pencil.

    Why bother with that? Just define it as 3 and be done!


    Define 1 / 3 to equal 3? ": 1/3 3 ;" might be acceptable in Forth, but
    not in mathematics!


    Without giving it much thought I'd say an algorithm is an unambiguous
    flowchart made up in order to achieve some goal. Anything more than that
    would take some qualifier to the word "algorithm".

    Must be nice to be the guy who makes the definitions.


    We all get to make such definitions. The question is whether other
    people agree with them or not. Dimiter's suggestion here covers the
    lowest common denominator that I expect most people would agree with -
    even if it is also common to include other features (such as your own suggestions of input, output and finiteness).

  • From George Neuner@21:1/5 to gnuarm.deletethisbit@gmail.com on Thu Nov 3 08:41:37 2022
    On Wed, 2 Nov 2022 11:37:18 -0700 (PDT), Rick C <gnuarm.deletethisbit@gmail.com> wrote:


    [David Brown's] idea that language is not defined is not valid. They
    are defined in a dictionary, actually, many, not unlike standards, lol.
    Terms with jargon meaning need their specific definition in that context,
    or they have little value.

    Actually dictionaries REFLECT usage of language rather than defining
    it. The meanings of words often differ with locality and often change
    over time.

    Unfortunately, many dictionaries are "abridged": they contain current
    and popular definitions - not necessarily ALL definitions - and
    pronunciation keys, but generally nothing of the origins or historical
    usages of the words.

    To get the history of a word you need either to find an unabridged
    print dictionary, or in an online search you must include the term
    "etymology" (and then hope the information is not behind a pay wall). [www.etymonline.com is free and good for English, but it is limited
    and has entries for only a few hundred thousand words.]

    YMMV,
    George


    [Last I checked (admittedly some years ago), Webster's unabridged
    English dictionary was 26 volumes. Oh, and that only covered the ~1.2
    million English words officially recognized at that time ... the
    dictionary of English slang was an additional 37 volumes. Altogether
    they filled up several shelves in the public library.

    Webster's online unabridged dictionary is pay walled.]
