Environment & Energy
US builds the world's fastest supercomputer. Know what it's doing? Modeling climate change.
David Fahrenthold retweeted @tsimonite: "The US government has quietly built the world's fastest supercomputer. And you know what it's doing? Modeling climate change."
Link to tweet
tymorial
(3,433 posts)Surely the president and the Republicans wouldn't lie about it. They are our elected leaders. They look out for us!
Couldn't resist
Doodley
(9,151 posts)
at140
(6,110 posts)and what I know for sure is, the software can be only as good as the programmers.
I have worked on a slow-ass IBM 1620, then on a much faster IBM 360, and then on an even faster Burroughs computer. The faster computers helped me progress faster in developing software that generated correct results. But if the algorithms were faulty, faster computers did not correct them.
Boomer
(4,170 posts)Do you have reason to believe that current climate models use faulty algorithms? Is your accurate but vague generality somehow specifically tied to this project?
Given the sheer number of factors that affect climate, I would expect that increased computing power will be useful in adding more forcing variables to the models. And the faster a model is run, the faster climate scientists can compare the output to known results and fine-tune the algorithms.
From what I know of their methodology, the models are developed by trying to predict climate that has already happened in our past. If the results match what actually happened, then you have at least some verification of the model's ability to predict future climate events. It's not like they're just making this up and accepting the results without question. And so far, climate predictions that have been made based on those models are proving to be pretty reliable. There's always room for improvement, but it beats flying blindly into the future.
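That hindcast check — run the model over a past period and compare its output against what was actually observed — can be sketched in a few lines. The temperature series and tolerance below are entirely hypothetical, just to show the shape of the comparison:

```python
# Minimal sketch of hindcast validation: run a model over a past period
# and compare its output to observed data. All numbers are hypothetical.

def rmse(predicted, observed):
    """Root-mean-square error between two equal-length series."""
    n = len(observed)
    return (sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n) ** 0.5

# Hypothetical global mean temperature anomalies (deg C) for six past decades
observed = [0.12, 0.18, 0.32, 0.45, 0.62, 0.85]
hindcast = [0.10, 0.21, 0.30, 0.48, 0.60, 0.88]  # model run over the same period

error = rmse(hindcast, observed)
print(f"hindcast RMSE: {error:.3f} deg C")

# A model is only trusted for projection if its hindcast error is small
TOLERANCE = 0.05
print("model passes hindcast check" if error < TOLERANCE else "model needs tuning")
```

Real validation is far more involved (many variables, many regions, ensemble runs), but the logic is the same: the past is the test set.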
at140
(6,110 posts)if the current models are good and reliable, the faster computer will not change the results.
So if you missed my main point, based upon 4 decades of developing engineering and manufacturing automation software, speed of computers is not as important as the validity of algorithms used.
Boomer
(4,170 posts)I'm not sure why you keep arguing a point that no one was trying to make. The obvious advantage of a super computer is speed. Given the complexity of climate change models, more work can be done if the models can be run faster and especially if you have greater access to a that super computer.
We're dealing with an emergency situation, so adding more speed to research progress is a big deal, even if the accuracy is the same. But running more models, more often, means that progress in accuracy is accelerated. This is a win-win situation.
at140
(6,110 posts)as something important, when supercomputers are already superfast.
littlemissmartypants
(22,839 posts)It is a discussion of the use of speed combined with artificial intelligence which addresses the algorithm issue you bring up.
Summit, which occupies an area equivalent to two tennis courts, used more than 27,000 powerful graphics processors in the project. It tapped their power to train deep-learning algorithms, the technology driving AI's frontier, chewing through the exercise at a rate of a billion billion operations per second, a pace known in supercomputing circles as an exaflop.
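For scale, "a billion billion operations per second" is 10^18. A quick back-of-envelope comparison shows what that rate means — the 100-GFLOPS laptop figure here is my own rough assumption, not from the article:

```python
# Back-of-envelope: what "an exaflop" means.
EXAFLOP = 10**18   # operations per second ("a billion billion")
LAPTOP = 10**11    # ~100 GFLOPS, a generous laptop estimate (assumption)

# Work a laptop does in one year, measured in operations:
year_seconds = 365 * 24 * 3600
laptop_year = LAPTOP * year_seconds

# Time an exaflop-class machine needs for the same amount of work:
seconds = laptop_year / EXAFLOP
print(f"one laptop-year of compute takes about {seconds:.2f} s at an exaflop")
```

Roughly three seconds per laptop-year — which is why adding forcing variables and rerunning ensembles becomes practical on a machine like Summit.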
at140
(6,110 posts)And that is my only retort. Emphasize better modelling as the prominent feature. That would make me happy. A faulty model run at higher computer speeds will generate faulty results faster.
OKIsItJustMe
(19,938 posts)Dr. Jeff Masters · January 8, 2019, 11:52 AM EST
Above: An August 2018 monsoon forecast for India, shown at left by a global weather model operating at 13-kilometer resolution. At right, the new IBM Global High-Resolution Atmospheric Forecasting System (GRAF) operates at 3-km resolution, showing much more detail, and updates 6 to 12 times more often than the current top global forecast models. Image credit: IBM.
Consider: One of the chief tasks of early digital computers was the "modeling" of trajectories:
http://zuse-z1.zib.de/simulations/eniac/history.html
At the time of World War II, intelligent bombs had not yet been developed, so ground-based artillery was used to attack the enemy. Depending on the distance of the target and the type of artillery, the shell had to be fired at a certain angle. This angle also depended on the weather, especially the wind. To know the correct angle in a specific situation, the artillerymen used so-called firing tables. But those firing tables had to be computed first.
Those ballistic computations were done at the Moore School of Electrical Engineering, part of the University of Pennsylvania, too.
Calculating a trajectory could take up to 40 hours using a desk-top calculator. The same problem took 30 minutes or so on the Moore School's differential analyzer. But the School had only one such machine, and since each firing table involved hundreds of trajectories it might still take the better part of a month to complete just one table. [1]
The speed-up in developing new artillery designs increased the need for computing power. In November 1942, US forces landed in French North Africa and entered terrain entirely different from anything they had encountered before. The existing firing tables turned out to be completely useless. That made computing power the bottleneck of the war machinery.
Under these circumstances John Mauchly, a member of the Moore School's Engineering, Science, and Management War Training (ESMWT) program, wrote a five-page memo called The Use of Vacuum Tube Devices in Calculating. In this paper he proposed a machine that would add 5,000 10-digit numbers per second, more than 100 times faster than the fastest computers of the time (in 1942 these were mechanical relay computers at Harvard and Bell Laboratories, operating at 15–50 additions per second [1]).
As processor speeds increased, the accuracy of the models increased. At this point, computer-guided munitions can instantaneously calculate trajectories, but in reality, it's still just modeling.
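The "modeling" behind a firing table is just numerical integration of a projectile's motion, step by tiny step. Here's a minimal sketch — the drag coefficient and muzzle speed are illustrative values I've chosen, not real artillery data:

```python
# Minimal sketch of the trajectory "modeling" a firing table encodes:
# stepping a projectile forward under gravity and simple quadratic air drag.
import math

G = 9.81        # gravity, m/s^2
DRAG = 0.00005  # illustrative drag coefficient (assumption, not real data)

def range_of_shot(speed, angle_deg, dt=0.001):
    """Integrate until the projectile returns to the ground; return range in m."""
    angle = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    while y >= 0.0:
        v = math.hypot(vx, vy)            # current speed
        vx -= DRAG * v * vx * dt          # drag opposes horizontal motion
        vy -= (G + DRAG * v * vy) * dt    # gravity plus drag vertically
        x += vx * dt
        y += vy * dt
    return x

# One column of a "firing table": range as a function of elevation angle
for angle in (15, 30, 45, 60):
    print(f"{angle:2d} deg -> {range_of_shot(450.0, angle):8.0f} m")
```

Each trajectory here takes tens of thousands of small time steps — exactly the kind of repetitive arithmetic that took hours on a desk calculator, and exactly why hundreds of trajectories per table made ENIAC worth building.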
It may be worthwhile to review some papers on climate models through the years:
- The GISS Model of the Global Atmosphere (1974)
- Greenhouse Effects due to Man-made Perturbation of Trace Gases (1976)
- Greenhouse Effect of Trace Gases, 1970-1980 (1981)
- Climate Impact of Increasing Atmospheric Carbon Dioxide (1981)
- Efficient Three-Dimensional Global Models for Climate Studies: Models I and II. (1983)
- Climate sensitivity: Analysis of feedback mechanisms (1984)
- Ice melt, sea level rise and superstorms: Evidence from paleoclimate data, climate modeling, and modern observations that 2 °C global warming could be dangerous (2016)
If you read through this progression, you'll see that the models have always been constrained by processor speed.
NickB79
(19,276 posts)Funny, you trotted out one of the climate denier's most popular arguments a few days ago (it's not humans, it's the sun), tried to back it up with another denier talking point when called on it (what about the Dust Bowl?), and here you are sounding like you are repeating another one of their favorite arguments (the models are junk).
Curiouser and curiouser.......
https://www.democraticunderground.com/100211747334#post11
GeorgeGist
(25,324 posts)are in denial.