• A new way to solve the 'hardest of the hard' computer problems

    From ScienceDaily@1:317/3 to All on Tue Sep 21 21:30:38 2021
    A new way to solve the 'hardest of the hard' computer problems
    Scientists develop the next generation of reservoir computing

    Date:
    September 21, 2021
    Source:
    Ohio State University
    Summary:
    Researchers have found a way to make what is called reservoir
    computing work between 33 and a million times faster, with
    significantly fewer computing resources and less data input needed.



    FULL STORY ==========================================================================
    A relatively new type of computing that mimics the way the human brain
    works was already transforming how scientists could tackle some of the
    most difficult information processing problems.


    ==========================================================================
    Now, researchers have found a way to make what is called reservoir
    computing work between 33 and a million times faster, with significantly
    fewer computing resources and less data input needed.

    In fact, in one test of this next-generation reservoir computing,
    researchers solved a complex computing problem in less than a second on
    a desktop computer.

    With today's state-of-the-art technology, solving the same problem
    requires a supercomputer and still takes much longer, said Daniel
    Gauthier, lead author of the study and professor of physics at The Ohio
    State University.

    "We can perform very complex information processing tasks in a fraction
    of the time using much less computer resources compared to what reservoir computing can currently do," Gauthier said.

    "And reservoir computing was already a significant improvement on what
    was previously possible." The study was published today (Sept. 21,
    2021) in the journal Nature Communications.



    ==========================================================================
    Reservoir computing is a machine learning algorithm developed in the
    early 2000s and used to solve the "hardest of the hard" computing
    problems, such as forecasting the evolution of dynamical systems that
    change over time, Gauthier said.

    Dynamical systems, like the weather, are difficult to predict because
    just one small change in one condition can have massive effects down
    the line, he said.

    One famous example is the "butterfly effect," in which -- in one
    metaphorical illustration -- changes created by a butterfly flapping
    its wings can eventually influence the weather weeks later.

    Previous research has shown that reservoir computing is well-suited for
    learning dynamical systems and can provide accurate forecasts about how
    they will behave in the future, Gauthier said.

    It does that through the use of an artificial neural network, somewhat
    like a human brain. Scientists feed data on a dynamical system into a
    "reservoir" of randomly connected artificial neurons. The network
    produces useful output that the scientists can interpret and feed back
    into the network, building a more and more accurate forecast of how the
    system will evolve in the future.
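    As a rough illustration of that loop, the sketch below builds a toy
    reservoir computer (an echo state network) in Python: a fixed, randomly
    connected network is driven by an input series, and only a simple linear
    readout is trained. The network size, spectral radius, and ridge
    parameter here are illustrative choices, not values from the study.

    # Minimal sketch of a classic reservoir computer (echo state network).
    # Illustrative toy, not the authors' implementation.
    import numpy as np

    rng = np.random.default_rng(0)

    def make_reservoir(n_inputs, n_neurons=300, spectral_radius=0.9):
        """Random, fixed weights: only the readout is trained."""
        W_in = rng.uniform(-0.5, 0.5, (n_neurons, n_inputs))
        W = rng.normal(0, 1, (n_neurons, n_neurons))
        W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))  # stabilize
        return W_in, W

    def run_reservoir(W_in, W, inputs):
        """Drive the reservoir with the input series and collect its states."""
        states = np.zeros((len(inputs), W.shape[0]))
        x = np.zeros(W.shape[0])
        for t, u in enumerate(inputs):
            x = np.tanh(W_in @ u + W @ x)
            states[t] = x
        return states

    def train_readout(states, targets, ridge=1e-6):
        """Linear readout fitted by ridge regression."""
        A = states.T @ states + ridge * np.eye(states.shape[1])
        return np.linalg.solve(A, states.T @ targets)

    # Example: learn to predict the next value of a noisy sine wave.
    t = np.linspace(0, 60, 3000)
    series = np.sin(t) + 0.01 * rng.normal(size=t.size)
    u, y = series[:-1, None], series[1:, None]      # input and next-step target

    W_in, W = make_reservoir(n_inputs=1)
    states = run_reservoir(W_in, W, u)
    W_out = train_readout(states[200:], y[200:])    # discard warmup states
    prediction = states[200:] @ W_out

    Note the discarded initial states in the last step: that is the "warmup"
    the article returns to later, the stretch of input the reservoir must see
    before its state reflects the data rather than its arbitrary starting
    point.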



    ==========================================================================
    The larger and more complex the system, and the more accurate the
    scientists want the forecast to be, the bigger the network of artificial
    neurons has to be and the more computing resources and time are needed
    to complete the task.

    One issue has been that the reservoir of artificial neurons is a "black
    box," Gauthier said, and scientists have not known exactly what goes on
    inside of it -- they only know it works.

    The artificial neural networks at the heart of reservoir computing are
    built on mathematics, Gauthier explained.

    "We had mathematicians look at these networks and ask, 'To what extent
    are all these pieces in the machinery really needed?'" he said.

    In this study, Gauthier and his colleagues investigated that question
    and found that the whole reservoir computing system could be greatly
    simplified, dramatically reducing the need for computing resources and
    saving significant time.

    They tested their concept on a forecasting task involving a weather
    system developed by Edward Lorenz, whose work led to our understanding
    of the butterfly effect.
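    For reference, the system in question is the classic Lorenz-63 model,
    written out below with its standard chaotic parameter values (sigma = 10,
    rho = 28, beta = 8/3); treating this as the exact configuration used in
    the study is an assumption based on the article's reference to Lorenz's
    work.

    # The classic Lorenz-63 system: three coupled equations whose extreme
    # sensitivity to initial conditions makes forecasting it a demanding
    # benchmark.
    import numpy as np
    from scipy.integrate import solve_ivp

    def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return [sigma * (y - x),        # dx/dt
                x * (rho - z) - y,      # dy/dt
                x * y - beta * z]       # dz/dt

    # Integrate one trajectory; this is the kind of data a forecaster
    # would be trained on.
    t_eval = np.linspace(0, 25, 1000)
    sol = solve_ivp(lorenz, (0, 25), [1.0, 1.0, 1.0], t_eval=t_eval)
    trajectory = sol.y.T   # shape (1000, 3)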

    Their next-generation reservoir computing was a clear winner over
    today's state-of-the-art on this Lorenz forecasting task. In one
    relatively simple simulation done on a desktop computer, the new system
    was 33 to 163 times faster than the current model.

    But when the aim was for great accuracy in the forecast,
    the next-generation reservoir computing was about 1 million times
    faster. And the new-generation computing achieved the same accuracy with
    the equivalent of just 28 neurons, compared to the 4,000 needed by the
    current-generation model, Gauthier said.
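    The underlying paper frames next-generation reservoir computing as a
    nonlinear vector autoregression: the large random reservoir is replaced
    by a small set of explicit features built from a few time-delayed copies
    of the input and their polynomial products, which is why so few effective
    "neurons" are needed. The sketch below illustrates that idea; the number
    of delays, the feature order, and the ridge value are illustrative
    assumptions rather than the paper's settings.

    # Minimal sketch of the next-generation idea: explicit time-delay
    # features plus their quadratic products, with a linear readout fitted
    # by ridge regression. Illustrative, not the paper's exact configuration.
    import numpy as np
    from itertools import combinations_with_replacement

    def ngrc_features(series, delays=2):
        """Stack the current sample with `delays` past samples, then append
        all quadratic products of that linear part."""
        rows = []
        for t in range(delays, len(series)):
            lin = np.concatenate([series[t - d] for d in range(delays + 1)])
            quad = [lin[i] * lin[j]
                    for i, j in combinations_with_replacement(range(lin.size), 2)]
            rows.append(np.concatenate([[1.0], lin, quad]))  # constant + lin + quad
        return np.array(rows)

    def fit_readout(features, targets, ridge=1e-4):
        A = features.T @ features + ridge * np.eye(features.shape[1])
        return np.linalg.solve(A, features.T @ targets)

    # Example with a 3-variable series (e.g. a Lorenz trajectory of shape
    # (N, 3)); random placeholder data stands in for a real trajectory here.
    rng = np.random.default_rng(1)
    data = rng.normal(size=(500, 3))
    X = ngrc_features(data[:-1])   # features at times delays .. N-2
    Y = data[3:]                   # next-step targets aligned with X
    W_out = fit_readout(X, Y)
    next_step = X @ W_out

    Because each feature vector looks back only a few time steps, the model
    needs only those few samples of "warmup" before it can start forecasting,
    which is the point Gauthier makes next.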

    An important reason for the speed-up is that the "brain" behind this
    next generation of reservoir computing needs a lot less warmup and
    training compared to the current generation to produce the same results.

    Warmup is extra data that must be fed into the reservoir computer as
    input to prepare it for its actual task.

    "For our next-generation reservoir computing, there is almost no warming
    time needed," Gauthier said.

    "Currently, scientists have to put in 1,000 or 10,000 data points or more
    to warm it up. And that's all data that is lost, that is not needed for
    the actual work. We only have to put in one or two or three data points,"
    he said.

    And once researchers are ready to train the reservoir computer to make
    the forecast, again, a lot less data is needed in the next-generation
    system.

    In their test of the Lorenz forecasting task, the researchers could get
    the same results using 400 data points as the current generation produced
    using 5,000 data points or more, depending on the accuracy desired.

    "What's exciting is that this next generation of reservoir computing takes
    what was already very good and makes it significantly more efficient,"
    Gauthier said.

    He and his colleagues plan to extend this work to tackle even more
    difficult computing problems, such as forecasting fluid dynamics.

    "That's an incredibly challenging problem to solve. We want to see if
    we can speed up the process of solving that problem using our simplified
    model of reservoir computing." Co-authors on the study were Erik Bollt, professor of electrical and computer engineering at Clarkson University;
    Aaron Griffith, who received his PhD in physics at Ohio State; and
    Wendson Barbosa, a postdoctoral researcher in physics at Ohio State.

    The work was supported by the U.S. Air Force, the Army Research Office
    and the Defense Advanced Research Projects Agency.

    ==========================================================================
    Story Source: Materials provided by Ohio State University. Original
    written by Jeff Grabmeier. Note: Content may be edited for style and
    length.


    ==========================================================================
    Journal Reference:
    1. Daniel J. Gauthier, Erik Bollt, Aaron Griffith, Wendson A. S. Barbosa.
    Next generation reservoir computing. Nature Communications, 2021; 12 (1).
    DOI: 10.1038/s41467-021-25801-2
    ==========================================================================

    Link to news story: https://www.sciencedaily.com/releases/2021/09/210921081010.htm

    --- up 2 weeks, 5 days, 8 hours, 25 minutes
    * Origin: -=> Castle Rock BBS <=- Now Husky HPT Powered! (1:317/3)