Route of all evil. Yup, it is bad, and in many ways.
I was watching a project recently where the poor user was bitten by premature optimization.
The developers spent so much time on making the application elegant and beautiful and future proof that:
- It took a lot longer to get it out there
- The users had to jump through hoops to do their own work, and there were delays in getting fresh content out there
- The iteration “make it easier for users” was always “just after the next thing”
If the user had been considered first, something usable would have been out there, and THEN the developers could have made it work with caching after the fact.
That being said, you obviously don’t want to be dumb with your architecture, making it hard to grow in the future. I have seen that happen too.
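To make the "caching after the fact" point concrete, here is a minimal sketch in Python. The project and its code aren't shown in the post, so `fetch_page` is a hypothetical stand-in for whatever slow operation the users were waiting on; the point is only that the simple version ships first and the cache is bolted on later without touching the callers.

```python
from functools import lru_cache

# Hypothetical example: the simple version that ships first.
def fetch_page(path: str) -> str:
    # Imagine a slow render or database hit here.
    return f"<html>{path}</html>"

# Later, once real usage shows it is slow, wrap the same function
# in a cache after the fact -- callers don't change at all.
fetch_page = lru_cache(maxsize=256)(fetch_page)

print(fetch_page("/home"))  # first call does the work
print(fetch_page("/home"))  # repeat call is served from the cache
```

The same after-the-fact wrapping works for any pure function, which is exactly why deferring this kind of optimization is usually cheap.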
February 26th, 2007 at 4:05 am
According to Donald Knuth, it has to be “root of all evil” :-)
February 26th, 2007 at 7:14 am
Clearly Dion meant “the route to all evil”. : )
February 26th, 2007 at 7:38 am
Wow, I just wrote about the flip side of the coin: My point was to always strive for simplicity — for the user and for your code. Then, performance optimizations magically appear. Then, and only then, you should do them.
February 27th, 2007 at 8:46 am
The simplest possible code is usually the worst application design.
The curve from simple to complex is a hump, with quality on the y-axis.
The highest-quality code is the simplest code that can accommodate new features or changes to current features without a lot of pain and without increasing the cost of maintenance.