I have worked on dozens of applications in my career, but I have yet to see one that becomes faster over time. On the contrary, left to their own devices, applications tend to get progressively slower until they eventually become useless. Every new feature, every new piece of data, and every additional user adds complexity, which necessarily lowers performance unless the available resources expand accordingly. And until cloud computing delivers on its ultimate promise of elastic resources, getting more horsepower is not just a question of money; it is constrained even more by the time and effort of everyone involved in developing the application.
If you want to avoid the death spiral of ever-slower performance, the first step is to start measuring performance and to visualize the trend for everybody to see. It is surprisingly difficult to get everybody to agree on how fast something is running, but once everybody is looking at the same chart, you can at least agree that the trend doesn't look good. Still, as the Jiras, Bugzillas, and Tracs of the world testify, merely noting that something is wrong is hardly a recipe for resolving it. Unless effort is invested in optimizing performance and fixing issues, the system will eventually grind to a halt, and recovering will require a major push.