userbinator 2 days ago | link
No, I don’t think the things most people do with computers should require any faster hardware; the problem is that software is often written to require ever more resources, under the false assumption that processing speed and memory are “infinite” or close to it.

The exponential growth that started decades ago has fostered a culture of extreme waste. From the earliest notions of “premature optimisation”, through the rise of structured programming and OOP with its many-layered abstractions, to the latest trend of ultra-high-level frameworks and the web-application movement, there is a constantly present notion that “abstractions and computing power are free, but programmer time is expensive”. Opposition to this seems to have increased in recent years, but it is still a prevalent attitude and is still taught in many schools. People are forced to upgrade their hardware frequently (with the associated waste and manufacturing costs) just to run the latest versions of software – often to do the same things, at the same speeds, as before. It’s probably not a stretch to say that software on average is now a few orders of magnitude larger and slower than it needs to be.
This trajectory resembles the early history of the car industry – fuel was initially cheap, so manufacturers (and consumers) paid little attention to fuel efficiency, but the oil shortages of the 70s forced some pretty rapid changes once people realised that what they were doing was not sustainable. There has been much growth in interest in efficient hardware recently, which is good, but the other half of the equation, software, matters just as much. So I think anyone who still believes that mantra about programmer time, when working on software intended for a large number of users, is as absurd as someone in the car industry saying “engineer time is expensive, but fuel is cheap”. Processors may be running into the laws of physics, but I don’t think many programmers have reached the limits of their brainpower yet. :-)
Depends on exactly what you mean. You can keep adding cores until the cows come home. And I don’t really buy the “we don’t know how to use all those extra cores” argument. Multi-threaded code isn’t the rocket science it’s portrayed to be in the press.
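For what it’s worth, the common embarrassingly-parallel case really is straightforward. A minimal sketch in Go (the function `parallelSum` is my own illustration, not a standard API): one goroutine per core, each summing its own chunk of the slice into its own slot of a results array, so the workers never contend and no locks are needed.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

// parallelSum splits nums across one goroutine per CPU core.
// Each worker writes only to its own slot in partial, so the
// goroutines share no mutable state and no locking is required.
func parallelSum(nums []int) int {
	workers := runtime.NumCPU()
	chunk := (len(nums) + workers - 1) / workers
	partial := make([]int, workers)

	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func(w int) {
			defer wg.Done()
			lo := w * chunk
			if lo >= len(nums) {
				return // more workers than elements: nothing to do
			}
			hi := lo + chunk
			if hi > len(nums) {
				hi = len(nums)
			}
			for _, v := range nums[lo:hi] {
				partial[w] += v
			}
		}(w)
	}
	wg.Wait()

	// Combine the per-worker results sequentially.
	total := 0
	for _, p := range partial {
		total += p
	}
	return total
}

func main() {
	nums := make([]int, 1_000_000)
	for i := range nums {
		nums[i] = 1
	}
	fmt.Println(parallelSum(nums)) // prints 1000000
}
```

The hard parts of multi-threading show up when threads share mutable state; structuring the work so they don’t, as above, sidesteps most of it.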
One thing that may become practical is die stacking, depending on what they can do about extra heat.