Drew Crawford recently wrote an excellent technical article on why web apps are slow; you can read it here:
The article really drives home the point that when, as a developer, you abdicate responsibility for the hard stuff, such as memory management, and delegate it to your programming language or environment, you end up with code that is not only suboptimal in general but also prone to side effects, like garbage collection kicking in when you least want it.
To give an example: during user interaction (when the user is doing something with the application, like browsing photos or entering information), the application usually has more on screen than when it is idle, and more on screen means more memory in use. It is precisely then that the application is likely to exceed some memory utilisation threshold, and that is when the garbage collector kicks in to clean up. Garbage collection is very CPU intensive, so it competes with the rest of the application, such as the user interface code, for CPU time. Yet this is exactly the moment you want the user interface to have all the CPU it needs to be at its smoothest.
Some scripting languages, for example Lua, provide a crude way of controlling the garbage collector (collect now, or don't collect now), so collection can be deferred until a better time. However, the risk is that the application runs out of memory at the wrong moment. In any case, memory management is only one aspect of Crawford's article.
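Python's `gc` module offers a similarly crude on/off control, so it can illustrate the same pattern. The sketch below is an assumption about how an application might use it; the function names are hypothetical, not from any particular framework:

```python
import gc

def handle_user_interaction():
    # Defer collection while the UI must stay responsive.
    # Risk: garbage accumulates until collection is re-enabled.
    gc.disable()
    try:
        pass  # ... handle input, render frames, etc.
    finally:
        gc.enable()

def on_idle():
    # Collect at a quiet moment of our own choosing instead.
    # Returns the number of unreachable objects found.
    return gc.collect()
```

Lua exposes the same idea through `collectgarbage("stop")`, `collectgarbage("restart")`, and `collectgarbage("collect")`. In both cases the trade-off is identical: you gain control over *when* the pause happens, at the cost of watching memory yourself while collection is off.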
In the world of set-top boxes and TVs, which have even less processing power than mobile phones, this is not only relevant but critical. If the user experience needs to be silky smooth, you need to milk every last cycle out of your CPU.