Advancement in technology is one of humanity's great hopes.
I stopped trying to keep up with the bleeding edge of technology in 1996, after I bought a Pentium Pro @ 200 MHz with a 1 MB L2 cache. I paid $1,000 for the processor alone. I kept that PC until 1999, when I wanted to run DirectX applications, which were unbearably slow on it.
Since then I've found that the best thing to do, when you're forced to upgrade for one performance reason or another, is to buy the most reasonable second-tier technology available at that instant and drive it till the wheels fall off. I'm sure by today's standards my Intel Core 2 Duo E6600 @ 2.4 GHz is a POS, but it does all I need it to do.
After about a dozen times around the high-performance technology ferris wheel, the bleeding edge loses its appeal. No matter what you buy, it loses half its value once you drive it off the lot, and in 6 months it will be obsolete. It's not a matter of money, because I can afford any system I could ever want; it's just the ennui of who cares.
In any case, my desire for the highest-performance hardware is greatly diminished by the crap software most companies put out these days. If you look at it honestly, you have to acknowledge that over my working life, which is a bit in excess of 30 years, processor, memory, video, cache, system bus, or any other measure of hardware performance you wish to choose has increased by at least 1,000 times: generally from MHz to a corresponding value in GHz.
Over the same period, about the most generous allowance you can grant is that overall system performance is perhaps 10 times better than it was 30 years ago, and that's being very generous. Viewed this way, software has effectively wasted a hardware performance improvement of over 100 times. In other words, of a 1,000-fold raw hardware gain, software has delivered only about 1 percent to the user.
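The back-of-envelope arithmetic here can be checked in a few lines. Note that the 1,000× and 10× figures are rough estimates from the argument above, not measured data:

```python
# Rough check of the claim: the input figures are estimates, not measurements.
hardware_gain = 1000    # raw hardware speedup over ~30 years (MHz -> GHz)
perceived_gain = 10     # generous estimate of overall system speedup

wasted_factor = hardware_gain / perceived_gain   # how much improvement software soaked up
utilization = perceived_gain / hardware_gain     # fraction of the gain the user actually sees

print(f"wasted factor: {wasted_factor:.0f}x")        # 100x
print(f"effective utilization: {utilization:.0%}")   # 1%
```

So under these (admittedly rough) numbers, for every 100 units of hardware improvement, only 1 unit shows up as a faster system.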
Why should I spend my money on improved hardware so that the latest software can waste 99% of the improvement?