Interesting article about deterioration trends in software development
-
I think efficiency is a general problem in computing, not just in software.
I've been using PCs since before the internet, and I remember very well the days when I used Works on MS-DOS and when a 250-megabyte hard drive was the must-have item, with seemingly endless capacity.
Today's PCs are infinitely more powerful and a 1 TB hard drive is the norm.
But it turns out that, as a whole, applications are not one bit faster, and hard drives, where clusters have also grown in size, no longer hold as much as you'd expect.
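A small sketch of the cluster-size point: a file occupies whole clusters, so the bigger the cluster, the more disk space a small file wastes. The numbers below are hypothetical, just to illustrate the effect.

```python
import math

# A file always occupies a whole number of clusters, so the space it
# takes on disk is its size rounded up to the next cluster boundary.
def disk_usage(file_size: int, cluster_size: int) -> int:
    return math.ceil(file_size / cluster_size) * cluster_size

file_size = 1_000                        # a 1 KB file
print(disk_usage(file_size, 4_096))      # 4096 bytes on a 4 KB-cluster volume
print(disk_usage(file_size, 65_536))     # 65536 bytes on a 64 KB-cluster volume
```

The same 1 KB file costs sixteen times more disk on the larger clusters, which is why a big drive formatted with big clusters holds fewer small files than the raw capacity suggests.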
To use a simile with a car, say the VW Golf: comparing the performance of the first model with a modern one, we see that neither performance nor fuel cost differs much. The new models carry more technology and active safety, which makes the car more powerful and the engine more efficient for the same fuel cost, but the car also weighs almost twice as much as the first model, which in the end cancels out those advantages.

Efficiency in software: I remember a small game, kkrieger, that caused a certain sensation. A first-person 3D shooter with quite acceptable graphics and even sound and ambient music, although a short one that can be completed in a few minutes.
But what caused the sensation was that the entire game occupied only 96 KB.
-
Vivaldi is a good example of such software, especially compared to O12.
-
I wonder if things would be different if the developers of bloated software only had low-spec devices to build and test on.
I also agree with a lot of what is said about webpages being completely bloated. I often see those "blank divs" that get filled in a while after the page has loaded. You know what's happening there? The page is loading everything client-side instead of server-side. I'm not convinced that Google's AMP is the way to go, but reducing the number of ads and doing more server-side preparation where possible would be a good start.
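A toy sketch of the difference (all names and data here are made up for illustration): client-side rendering ships an empty placeholder plus a script bundle, so the browser needs a second round trip and some JS execution before anything useful appears; server-side rendering ships the finished HTML.

```python
# Hypothetical illustration: what the browser receives in each case.
articles = [{"title": "Software disenchantment"}, {"title": "kkrieger in 96 KB"}]

# Client-side style: an empty div the browser must fill in later,
# after downloading and running bundle.js and fetching the data.
client_side_page = '<div id="app"></div><script src="bundle.js"></script>'

# Server-side style: the server fills in the markup before sending,
# so the page can be painted immediately on arrival.
items = "".join(f"<li>{a['title']}</li>" for a in articles)
server_side_page = f"<ul>{items}</ul>"

print(server_side_page)
```

The first response is tiny but useless until more work happens on the client; the second is complete the moment it arrives.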
Upcoming versions of Android / Google Play are going to cut down Android app bloat: instead of one package containing resources for every possible device, apps will be delivered built against each specific device, which should hopefully help reduce package size.
I am also really annoyed with the "best effort" attitude some software developers have (one that comes to mind is AAA game developers releasing games with week 1 patches). I have often seen the concept that you can just test by releasing your product to users, and fix bugs after the fact. Thanks to the internet any developer could, but not enough of them have stopped to think if they should.
Similarly, I hate the concept of having to muddle through the quirks of a new workflow - the article mentions struggling with node modules at a point.
One point I disagree with in the article is the criticism of using heavy frameworks to develop apps. Yes, there are cases where you end up over-engineering things to the point of distraction, but I also recognise that frameworks can make development easier (and in turn reduce bugs, assuming there are none in the framework itself). Just look at Vivaldi: you could have developed Vivaldi without libraries like react.js, but would it have made as much progress using just bare-bones JS? I don't think so.
-
@tveastman: I have a Python program I run every day, it takes 1.5 seconds. I spent six hours re-writing it in rust, now it takes 0.06 seconds. That efficiency improvement means I’ll make my time back in 41 years, 24 days
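The arithmetic in that tweet checks out; a quick sanity check with the tweet's own numbers, assuming one run per day:

```python
# Re-derive the break-even time from the tweet's numbers.
seconds_saved_per_run = 1.5 - 0.06      # 1.44 s saved on each daily run
rewrite_cost_seconds = 6 * 60 * 60      # six hours spent on the Rust rewrite

days_to_break_even = rewrite_cost_seconds / seconds_saved_per_run
print(days_to_break_even)               # about 15000 days
print(days_to_break_even / 365.25)      # roughly 41 years
```

Which is the classic trade-off: optimization has a cost too, and for a script run once a day it may never pay for itself in saved wall-clock time.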
You’ve probably heard this mantra: “programmer time is more expensive than computer time”.
This just got me wondering whether one day we will have AI powerful and advanced enough that it could take care of software optimization.
Imagine this: a programmer creates some app, hands it over to the AI and the AI rewrites the app completely so that it has exactly the same functionality but uses fewer resources and generally works faster. This could be insignificant for some simple programs, like notepad, but it can be a huge improvement for the more advanced apps, like games.
-
@pafflick This is basically what "the computer" does in Star Trek. An officer says what they want done (a natural language specification), and the computer does all the work of programming and finding the result (we would assume in the most efficient way).
Both of those tasks seem nigh on impossible to do in reality, though. It might be possible to do the first one, but the second one seems like the travelling salesman problem on steroids.
-
Programmed obsolescence. (I build machines, budget usually between €250-750, depending mainly on the money left over.)
Noticed a lot with:
- MS Office (I kept 2010 patched; newer versions are far too slow. Should I migrate to OpenOffice or WPS?);
- Heavy graphic editors (like PaintShop Pro, which I dropped);
- Transition from 98SE to XP (well, pretty normal; NT was known to require more resources, but it was way, way better);
- Transition from XP to Seven (mostly on the Explorer side, with the ugly thumbnail cache; this still applies to Win10);
- Transition from XP to Vista (I rolled back to XP with third-party AERO and a lot of system hooks, and it was still faster);
- Firefox >2.0 (Gecko was getting old);
- Android (under-optimized, and it's becoming hard or annoying to root OEM builds);
- Most antivirus software (most of them can kill performance);
- HDDs (I had to buy an SSD to avoid waiting SIX hours for a Windows update xD). Well, the system is 20-30% faster, but only on C:\, and I can't afford many SSDs as they're still quite expensive;
- CPU: Intel vs. AMD. I fear Windows tends to work slightly better with Intel, but I still prefer AMD (they change sockets less often and are usually cheaper, even if they run hotter);
- GPU: a GPU that's too old can kill overall performance, since almost every app now uses the GPU, and it can easily saturate the CPU if the GPU isn't powerful enough.
-
I agree with the article. Software is constantly bloated with useless features and dirty code; developers are lazy and lack passion, and the industry brute-forces fixes with hardware instead of actually fixing the problem.
I spend hours/days revising the code to make sure it's clean, fast and stable. In some cases I rewrite whole sections, and avoid using dependencies.
We already use machine learning to help clean code, but it's not enough. Developers need to change and stop being lazy; if they lack passion and the will to learn, they're in the wrong business.
As he said, we have the ability to do better.
-
@kobi True. Well-optimized software keeps similar performance over time.
-
I have on my PC plenty of software considered essential, such as a free Office suite, GIMP and others.
But my PC is old and not very powerful, and these applications are quite slow and heavy, so I use them less and less, turning instead to excellent online applications that are much simpler to use, quick, and cover almost 99% of what I need.
I think there are many users who do not need most of the functions that modern applications incorporate, functions that make them unnecessarily heavy and slow without offering significant advantages in return...
I don't need 5,000 functions and formats to write a letter, nor 8,000 templates, filters and brushes if I am not a professional designer.
I mentioned MS Works before, the light version of MS Office that Windows included for free up to XP, and it was a real delight: it offered the same applications as MS Office (word processor, spreadsheet, database, etc.) with fewer features, but enough for the vast majority of users. It was quick and very light, but it died hidden inside the system; while many users were unaware of its existence, MS promoted MS Office front and center and never mentioned this small, free beauty.
-