• More of my philosophy of how productivity is so important and about how

    From Amine Moulay Ramdane@21:1/5 to All on Mon Apr 18 17:33:38 2022
    Hello,


    More of my philosophy of how productivity is so important and about how
    to scale productivity..

    I am a white arab from Morocco, and i think i am smart since i have invented many scalable algorithms and other algorithms..


    I have passed two certified IQ tests and i have scored above 115 IQ.


    I think the most important thing to make a country much richer is to increase "productivity" much more, which increases the GDP of the country much more. Of course we can use artificial intelligence and automation to increase productivity much more, but the important thing to also ask is how to "scale" productivity, and that is what i am answering below; but first read my following thoughts so that you understand:

    So read the following to notice it:

    "Compelling data reveal a discouraging truth about growth today. There
    has been a marked decline in the ability of the traditional levers of
    production, capital investment and labor, to propel economic growth."

    And read the following:

    "Accenture research on the impact of AI in 12 developed economies
    reveals that AI could double annual economic growth rates in 2035 by
    changing the nature of work and creating a new relationship between man
    and machine. The impact of AI technologies on business is projected to
    increase labor productivity by up to 40 percent and enable people to
    make more efficient use of their time."

    Read more here to notice it:

    https://www.accenture.com/ca-en/insight-artificial-intelligence-future-growth-canada

    And McKinsey estimates that AI (artificial intelligence) may deliver an additional economic output of around US$13 trillion by 2030, increasing
    global GDP by about 1.2% annually. This will mainly come from
    substitution of labour by automation and from increased innovation in
    products and services.

    Read more here:

    https://www.europarl.europa.eu/RegData/etudes/BRIE/2019/637967/EPRS_BRI(2019)637967_EN.pdf

    And you can read my thoughts about artificial intelligence and productivity, and about China and its artificial intelligence and computer chips, in the following web link, so that you also understand
    how artificial intelligence will increase productivity much more:

    https://groups.google.com/g/alt.culture.morocco/c/UOt_4qTgN8M

    And now i will ask a philosophical question:


    How to efficiently manage complexity?


    I think you can manage complexity by the “divide and rule” approach
    to management, which leads to the hierarchical division of large organisations, or which also leads to the division of "labour"; you can read more about the division of labour here:


    https://en.wikipedia.org/wiki/Division_of_labour


    Also you can manage complexity by using constraints, such as laws, road rules and commercial standards, all of which limit the potential for harmful interactions to occur. You can also manage complexity by using higher layers of abstraction, such as in
    computer programming. And we can also follow the efficient rule of "Do less and do it better", which can also use higher-level layers of abstraction to enhance productivity and quality; this rule is good for both productivity and quality. And about
    productivity, i invite you to read the following thoughts about productivity from the following PhD computer scientist:


    https://lemire.me/blog/about-me/


    Read his thoughts about productivity here:


    https://lemire.me/blog/2012/10/15/you-cannot-scale-creativity/


    And i think he is making a mistake:


    Since we have that Productivity = Output/Input


    But better human training and/or better tools and/or better human smartness and/or better human capacity can make the parallel productivity part much bigger than the serial productivity part, so it can scale much more (it is like Gustafson's Law), and it
    looks like the following:


    About parallelism and about Gustafson’s Law..


    Gustafson’s Law:


    • If you increase the amount of work done by each parallel
    task then the serial component will not dominate
    • Increase the problem size to maintain scaling
    • Can do this by adding extra complexity or increasing the overall
    problem size


    Scaling is important, as the more a code scales, the larger the machine it
    can take advantage of:


    • can consider weak and strong scaling
    • in practice, overheads limit the scalability of real parallel programs
    • Amdahl’s law models these in terms of serial and parallel fractions
    • larger problems generally scale better: Gustafson’s law
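
    As a concrete illustration of the two laws in the bullets above, here is a small Python sketch (the serial fraction s = 0.05 is just a made-up example value):

    ```python
    def amdahl(s: float, n: int) -> float:
        """Amdahl's law: fixed problem size, serial fraction s limits speedup.
        speedup(N) = 1 / (s + (1 - s) / N), capped near 1/s as N grows."""
        return 1.0 / (s + (1.0 - s) / n)

    def gustafson(s: float, n: int) -> float:
        """Gustafson's law: the problem grows with N, so the serial part
        does not dominate. scaled_speedup(N) = s + (1 - s) * N."""
        return s + (1.0 - s) * n

    s = 0.05  # example: 5% of the work is serial
    for n in (16, 256):
        # e.g. with N=256: Amdahl gives about 18.6, Gustafson gives 243.25
        print(n, round(amdahl(s, n), 2), round(gustafson(s, n), 2))
    ```

    Notice how Amdahl's speedup saturates near 1/s = 20 no matter how many processors you add, while Gustafson's scaled speedup keeps growing, which is exactly the "increase the problem size to maintain scaling" point above.
    
    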


    Load balance is also a crucial factor.


    So read my following thoughts about the Threadpool to notice that my Threadpool, which scales very well, also load balances well:


    ---


    About the Threadpool..


    I have just read the following:


    Concurrency - Throttling Concurrency in the CLR 4.0 ThreadPool


    https://docs.microsoft.com/en-us/archive/msdn-magazine/2010/september/concurrency-throttling-concurrency-in-the-clr-4-0-threadpool


    But i think that both of the methodologies from Microsoft, the hill climbing one and the control theory one using a band-pass filter or matched filter and a discrete Fourier transform, have a weakness: they are "localized" optimizations that
    maximize throughput, so they are not fair, so i don't think i will implement them. Instead you can use my following invention of an efficient Threadpool engine with priorities that scales very well (and you can use a second Threadpool for IO etc.):


    https://sites.google.com/site/scalable68/an-efficient-threadpool-engine-with-priorities-that-scales-very-well
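
    To illustrate the general idea of a thread pool that serves tasks by priority (this is only a minimal Python sketch of the concept, not the Delphi/FreePascal implementation from the link above), workers can pull from a shared priority queue:

    ```python
    import threading, queue, itertools

    class PriorityThreadPool:
        """Minimal sketch: workers pull tasks in priority order (0 = highest).
        A monotonic counter breaks ties so equal-priority tasks keep FIFO order."""

        def __init__(self, num_workers: int = 4):
            self._q = queue.PriorityQueue()
            self._seq = itertools.count()
            self._workers = [threading.Thread(target=self._run, daemon=True)
                             for _ in range(num_workers)]
            for w in self._workers:
                w.start()

        def submit(self, priority: int, fn, *args):
            # Lower number = higher priority; the counter prevents
            # PriorityQueue from ever comparing the fn objects themselves.
            self._q.put((priority, next(self._seq), fn, args))

        def _run(self):
            while True:
                _, _, fn, args = self._q.get()
                if fn is None:          # sentinel: shut this worker down
                    return
                fn(*args)

        def shutdown(self):
            # One lowest-priority sentinel per worker, then wait for them.
            for _ in self._workers:
                self._q.put((float("inf"), next(self._seq), None, ()))
            for w in self._workers:
                w.join()
    ```

    A real engine would add per-priority queues or lock-free structures to reduce contention on the single shared queue; this sketch only shows the priority-dispatch behaviour.
    
    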


    And here is my other Threadpool engine with priorities:


    https://sites.google.com/site/scalable68/threadpool-engine
  • From Citizen@21:1/5 to All on Tue Apr 26 17:41:33 2022
    Amine Moulay Ramdane wrote on Tuesday, 19 April 2022 at 02:33:41 UTC+2:
    [...]
    And here is my other Threadpool engine with priorities:


    https://sites.google.com/site/scalable68/threadpool-engine-with-priorities


    And read my following previous thoughts to understand more:


    About the strategy of "work depth-first; steal breadth-first"..


    I have just read the following webpage:


    Why Too Many Threads Hurts Performance, and What to do About It


    https://www.codeguru.com/cpp/sample_chapter/article.php/c13533/Why-Too-Many-Threads-Hurts-Performance-and-What-to-do-About-It.htm


    Also I have just looked at the following interesting video about Go scheduler and Go concurrency:


    Dmitry Vyukov — Go scheduler: Implementing language with lightweight concurrency


    https://www.youtube.com/watch?v=-K11rY57K7k


    And i have just read the following webpage about the Threadpool of Microsoft .NET 4.0:


    https://blogs.msdn.microsoft.com/jennifer/2009/06/26/work-stealing-in-net-4-0/


    And as you are noticing, the first web link above is speaking about the strategy of "work depth-first; steal breadth-first", but we have to be smarter, because i think that this strategy, which is advantageous for cache locality, works best for
    recursive algorithms: a thread takes the first task, and since the algorithm is recursive it will put the child tasks inside its local work-stealing queue, and the other threads will start to take from that work-stealing queue, so the
    work will be distributed correctly. But when you iteratively start many tasks, i think there will be much more contention on the work-stealing queue, and this is a weakness of this strategy. Also, when the algorithm is not recursive
    and the threads are receiving from the global queue, there will be high contention on the global queue, and this is not good. MIT's Cilk, the Go scheduler, the Threadpool of
    Microsoft and Intel® C++ TBB are using this strategy of "work depth-first; steal breadth-first", and as you are noticing, they are giving more preference to cache locality than to scalability.
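
    The "work depth-first; steal breadth-first" discipline described above can be sketched as follows (a hypothetical Python illustration of the concept only; real schedulers like Cilk, Go and TBB use lock-free deques, while a plain lock keeps this sketch short):

    ```python
    import collections, threading

    class WorkStealingDeque:
        """Owner pushes and pops at one end (the newest tasks, which is the
        depth-first order with good cache locality); thieves steal from the
        other end (the oldest tasks, which is the breadth-first order and
        tends to hand over the largest remaining subtrees of work)."""

        def __init__(self):
            self._deque = collections.deque()
            self._lock = threading.Lock()

        def push(self, task):            # owner: add newest task
            with self._lock:
                self._deque.append(task)

        def pop(self):                   # owner: take newest task first
            with self._lock:
                return self._deque.pop() if self._deque else None

        def steal(self):                 # thief: take oldest task first
            with self._lock:
                return self._deque.popleft() if self._deque else None
    ```

    With tasks pushed in order a, b, c, the owner's pop returns c (depth-first) while a thief's steal returns a (breadth-first), which is exactly the split the strategy name describes.
    
    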


    But in my following invention of a Threadpool that scales very well i am giving more preference to scalability than to cache locality:


    https://sites.google.com/site/scalable68/an-efficient-threadpool-engine-with-priorities-that-scales-very-well


    Other than that, when you are doing IO with my Threadpool, you can use asynchronous IO by starting a dedicated thread for IO to be more efficient, or you can start another of my Threadpools and use it for the tasks that do IO; you can use the same method
    when threads of my Threadpool are waiting or sleeping..


    Other than that, for recursion and the stack overflow problem, you can convert your function from recursive to iterative form to solve the problem of stack overflow.
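
    For example (a hypothetical Python illustration, not code from the libraries above), a recursive traversal can be rewritten with an explicit stack so that deep inputs no longer overflow the call stack:

    ```python
    def nested_sum_recursive(xs):
        """Recursive version: deep nesting can overflow the call stack."""
        total = 0
        for x in xs:
            total += nested_sum_recursive(x) if isinstance(x, list) else x
        return total

    def nested_sum_iterative(xs):
        """Iterative version: an explicit list replaces the call stack,
        so nesting depth is limited only by available memory."""
        total, stack = 0, [xs]
        while stack:
            current = stack.pop()
            for x in current:
                if isinstance(x, list):
                    stack.append(x)
                else:
                    total += x
        return total
    ```

    Both return the same result on shallow inputs, but only the iterative version survives, say, a list nested 100,000 levels deep.
    
    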


    Other than that, to be able to serve a great number of internet connections or TCP/IP socket connections, you can use my Threadpool with my powerful object-oriented stackful coroutines library for Delphi and FreePascal here:


    https://sites.google.com/site/scalable68/object-oriented-stackful-coroutines-library-for-delphi-and-freepascal




    Thank you,
    Amine Moulay Ramdane.

    Some of your words sound really wise.




    Mister Kristjan Robam

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)