Decades ago, the time a program spent running on an electronic computer was pretty expensive. A little bit of history: back then they were called “electronic” because a “computer” used to be a woman with good math skills who calculated logarithm tables and trajectories of ballistic weapons. One of the first jobs of their electronic counterparts was to do the same calculations faster and with better precision. Programmer time, on the other hand, was comparatively cheap. Programs were written in a very careful way, making sure not to waste precious CPU time or what little memory was available.
Later, computers became more mainstream. Now it was not only the big government agencies: large businesses and even universities could get enough money together to buy an electronic computer. Students needed special permission to gain access to the sacred device, mostly by turning in punch cards and getting a printout of the result (and the CPU time taken, i.e. the cost) the next day, or, if they were really lucky, by accessing the system on a terminal. There could be multiple “dumb terminals” connected to one of these “Big Iron” things humming away in the basement, with everyone sharing time slices.
Suddenly, the time of personal computing came along. Now everyone could buy a microcomputer that could be plugged into any standard socket, and people like you and me typed away at night in front of a glowing screen. I fondly remember the hours I spent on my Commodore 64 in the early 80s. Good times. And no one shed tears over the run time of a program; if my interpreted BASIC source of a Mandelbrot demo took a whole night to finish, who cared? At that point programmer time became more expensive than CPU time, and while people still spent lots of time making sure their programs ran fast, it was not as imperative as before. Reusing code became more important, and libraries were written that encapsulated common problems. Those libraries were not as finely tuned to the specific problem at hand, mind you, but more general, so they could be used for a variety of problems. People who used libraries created larger programs, as the library code had to be included, and, by and large, the programs ran more slowly.
PCs started to appear on the desks of office workers, some of them still connected to big servers in the basement, but more and more people started using “their own” computer at work. Software companies began to fiercely compete against each other, and getting a new program out faster than the others became paramount. More and more programmers started to use “easy” programming systems like Visual Basic to write programs that were quite slow and used more resources than their carefully crafted counterparts, but could be sold earlier. So what if it took three floppies instead of one?