Software developers in the 1970s faced severe limitations on disk space and memory. Every byte and clock cycle counted, and much ingenuity went into avoiding waste. The extra time programmers spent avoiding this waste translated directly into smaller, tighter software products, and so was seen to translate directly into sales revenue.
Since then, however, technological advances have multiplied processing capacity and storage density by orders of magnitude, and reduced the cost per MIPS and the cost per bit by similar orders of magnitude (see Moore's Law).
Additionally, the spread of computers through all levels of business and home life has produced a software industry many times larger than it was in the 1970s.
As a result, the emphasis in software design is argued to have shifted away from tightness of design, cleverness of algorithm and thriftiness of resource usage; instead, time-to-market is now seen as the key consideration.
The extra time needed to fully optimize software has always delayed time-to-market, losing some revenue, and the improvement in quality due to optimization was thought to win this revenue back. Now, however, the revenue lost by delaying time-to-market far exceeds the additional revenue that almost any optimization can produce.
The software industry has responded to this changing tradeoff by emphasizing rapid code development: programming tasks that had previously been areas of fine craftsmanship have been automated, and further automation has been built on top of that. The result is multiple layers of software abstraction resting on one another, and the task of the modern software programmer often consists more of bending automatic code generators and pre-written components to their will than of finely hand-tuning software for complete optimization. With the establishment of well-founded, stable, optimized and dependable software toolkits, functional code can be created much faster, and with more capability, than coding up equivalents by hand, where development time would be significantly longer. A case in point is NeXT's OpenStep Foundation Kit and Application Kit, a set of reusable objects that enabled developers to create functional and usable code faster than conventional methods.
Some hold that the result of modern rapid application development practices, which forgo the optimization practices of the past, is that modern software running on faster computers does not feel significantly faster to the user. This user impression, caused by layers of software abstraction consuming the underlying technical advances in pursuit of time-to-market, is the essence of bloatware. Abolishing these layers of abstraction, however, can hamper the ongoing development of a program: software structures that are well crafted for easy extensibility and maintenance make upgrading existing code simpler and faster for developers.
However, optimization at the machine code level need not be done by hand. Modern compilers perform extensive code optimization, which largely removes the need for hand-manipulation of assembly code. Naturally, such automatic optimization is never perfect, but the additional gain a programmer could achieve by fully hand-tuning the compiler's output is usually negligible.
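As a minimal illustration of this point (the function, file name and compiler flag below are hypothetical examples, not taken from any particular product), consider a straightforward C loop:

    #include <stdio.h>

    /* Hypothetical example: sum the first n integers with a plain loop. */
    unsigned long long sum_to_n(unsigned int n)
    {
        unsigned long long total = 0;
        for (unsigned int i = 0; i < n; i++)
            total += i + 1ULL;   /* naive accumulation, one add per iteration */
        return total;
    }

    int main(void)
    {
        /* With optimization enabled (e.g. "cc -O2 sum.c"), mainstream
           compilers commonly replace the loop above with code equivalent
           to the closed form n * (n + 1) / 2, a transformation a 1970s
           programmer would have had to write by hand in assembly. */
        printf("%llu\n", sum_to_n(100));   /* prints 5050 */
        return 0;
    }

Whether a given compiler performs this particular transformation depends on its version and settings, but the broader point stands: the residual gain from hand-tuning the generated machine code is usually small.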
An example of this bloating can be seen by looking at video games over time: based upon existing ROMs that have been converted from the original assembly language and machine code (see MAME), the 1982 arcade game Robotron: 2084 is approximately 84 kilobytes, the Atari 2600 home video game Yars' Revenge is about 4 kilobytes, DOOM, released in 1993, is approximately 12 megabytes, whilst today's Unreal Tournament is about 1.9 gigabytes in size.
This, however, is probably an unrealistic comparison; the visuals and gameplay of Unreal Tournament, for example, have been improved a hundredfold over those of the arcade games of the 1980s.
See also: feature creep.