It’s in the linked paper, beginning with this paragraph:
Why Forecasts are Improving
Key developments in observation, numerical modeling, and data assimilation have enabled these advances in forecast skill. Improved observations, particularly by satellite remote sensing of the atmosphere and surface, provide valuable global information many times per day. Much faster and more powerful computers, in conjunction with improved understanding of atmospheric physics and dynamics, allow more-accurate numerical prediction models. Finally, improved techniques for putting data and models together have been developed…
Sensitivity to initial conditions varies greatly in space and time, and an important but largely unsung advance in weather prediction is the growing ability to quantify the forecast uncertainty by using large ensembles of numerical forecasts that each start from slightly different but equally plausible initial states, together with perturbations in model physics.
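If you want to see the idea in miniature, here’s a toy sketch — the Lorenz-63 system standing in for a weather model, with a made-up perturbation size and member count (mine, not the paper’s). Run many copies from slightly perturbed but equally plausible initial states, and the spread among members becomes your uncertainty estimate:

```python
import numpy as np

def lorenz63_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz-63 system, the classic
    chaotic toy stand-in for an atmospheric model."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.array([dx, dy, dz])

def ensemble_forecast(analysis, n_members=50, n_steps=1500, pert=1e-3, seed=0):
    """Integrate n_members copies, each started from a slightly
    perturbed initial state; the ensemble spread at the end
    quantifies how fast forecast uncertainty has grown."""
    rng = np.random.default_rng(seed)
    members = analysis + pert * rng.standard_normal((n_members, 3))
    for _ in range(n_steps):
        members = np.array([lorenz63_step(m) for m in members])
    return members

analysis = np.array([1.0, 1.0, 1.0])   # best estimate of the current state
members = ensemble_forecast(analysis)
print("ensemble mean:  ", members.mean(axis=0))
print("ensemble spread:", members.std(axis=0))  # the forecast uncertainty
```

Operational centers do this with full numerical weather models and much cleverer perturbation schemes (plus the perturbed physics the quote mentions), but the principle is the same: tight spread means a confident forecast, wide spread means don’t trust it.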
I recently read a profile of a conservative or alt-right bigwig (possibly Robert Mercer or someone else related to Cambridge Analytica) who did scientific programming before he started a hedge fund. The story he tells is that he got disillusioned with government when speeding up a program by ten times didn’t result in a 90% reduction in costs; instead, the users simply increased the resolution of their runs so as to spend the same amount of money.
And here, in this very paper, we have a validation of the maxim: science is advanced by (intelligently) consuming the available computer power.
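To put rough numbers on why the speedup vanished: for a 3-D atmospheric model, refining the grid is brutally expensive, because cost grows with three spatial dimensions plus a proportionally shorter time step. A back-of-envelope sketch (the fourth-power scaling is a standard rule of thumb I’m assuming here, not a figure from the paper):

```python
# Rule-of-thumb cost scaling for a 3-D model: cost ~ (1/dx)**4
# (three spatial dimensions, plus a CFL-limited time step that
# shrinks in proportion to the grid spacing dx).
speedup = 10.0                        # the hedge-fund guy's 10x faster code
resolution_gain = speedup ** (1 / 4)  # finer grid affordable at equal cost
print(f"{speedup:.0f}x compute buys only {resolution_gain:.2f}x finer resolution")
# -> 10x compute buys only 1.78x finer resolution
```

Which is why a tenfold speedup quietly disappears into the grid instead of into the budget.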
That’s just programming in general. Usage expands to fit the available resources, and software bloats to match. Which is why you now need 16GB of RAM, a 3GHz processor, and a TB of drive space to edit the same text file you used to edit with 640k of RAM, an 8MHz processor, and a 720k floppy drive. Same reason people used to complain that it took a long time to download a song (which you can now download far faster than it plays) and now they complain that it takes a long time to download a 4K video.
In some circumstances, such as the web, it seems to be more design-push than user-pull, though. I certainly didn’t ask for all the “modern (sic) web design.”
“Flat”, for example, doesn’t work for anyone actually doing work in a UI that has at least as much complexity as UIs had in the Nineties. 3D may not be the best way, but making objects look like objects? What a concept!
Where did this flat shit come from? I want to say Gmail.
You had 640k RAM and a 3-1/2" DD floppy drive? Wow! I’ll bet you even had a 10 MB HDD! (My old Corona PC had 512k and 2 x 5-1/4" DD floppy drives, no HDD; lots of fun when writing and compiling C programs: boot up with DOS floppy, swap for a data floppy; insert MicroEmacs floppy in 2nd drive, swap out for compiler. It really was fun in a boneheaded way…)
We were taught in a seminar I attended that skeuomorphic and 3D design were no longer required because audiences/users were now sufficiently computer literate not to need those conventions to understand a UI.
To which my response is:
Windows 2.1 and 3.0
Mac Classic
Ergo BS, QED.
The teacher was too young to know either of those, and didn’t know his computer history very well either: he mentioned a text game based on another text game, and when I pointed out they were both based on Adventure from the 1970s, he had no idea what I was talking about (and assumed I was confused, grrr).
At work we joke about bevels coming back in fashion next. You never know.
Don’t they understand the definition of skeuomorphic? Fake wood paneling with no function is one thing, but they flatten things that are active, separate controls sitting right next to each other!
Are these guys really that ignorant? Do they ever do actual work outside Visual Studio? Apparently not.
Never mind 4K video, have you seen the size of some of these JavaScript frameworks?
Man, I sure hope he never finds out about highway construction and traffic congestion. Or fuel efficiency standards and gas consumption. Or productivity gains and hours worked. Where will he go when he’s disillusioned with the private sector?
No, just the one single floppy drive. At least you had 2 drives. Imagine “Please insert disk #2. Please insert disk #1.” every few minutes. Mine also started with 512k of RAM but I eventually upgraded it.
Another fond memory: finding a TSR (terminate-and-stay-resident program) that let me load and switch between three different programs. You had to manually allocate RAM to the three available slots, and it wasn’t multitasking, but it let me keep a couple of small utilities in RAM without having to exit the program and switch disks. I think one was EDLIN and the other was probably arj or pkzip.
The Corona was a peculiar system in that, if I wanted anything like monochrome graphics, I had to go back to the original OEM DOS 1.1 disks. The Corona’s monochrome graphics were proprietary. (Go figure!) To do any serious text work, though, I used generic DOS 3.3, and I ditched EDLIN as soon as I bloody well could. C programming has enough traps for the unwary when you can see entire functions in front of you, let alone when trying to visualise them one line at a time.