The only thing that has ever really worked for me was following the principles of Interaction Design. That is, following this workflow:
Requirements Gathering / Goal Definition > User Personas / Wireframes / Test Cases > Graphic Design / Development > Testing based on Test Cases.
Basically, you should have the whole interface defined in step 2, along with test cases that define the expected results for important features. Then, after (or while) you build the application, you use the test cases and the wireframes to verify that it matches what was designed.
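To make that concrete, a test case written in step 2 can pin down an expected result before any code exists. Here is a minimal sketch in Python's `unittest`, using a hypothetical `cart_total` feature as the thing under test (the function name, signature, and discount rule are all invented for illustration):

```python
import unittest

# Hypothetical feature stub -- in a real project this code is written
# later, in step 3; the test case below is agreed on first, in step 2.
def cart_total(prices, discount=0.0):
    """Sum the item prices, then apply a fractional discount."""
    return round(sum(prices) * (1.0 - discount), 2)

class CartTotalTestCase(unittest.TestCase):
    """Expected results defined during wireframing, before development."""

    def test_total_without_discount(self):
        self.assertEqual(cart_total([10.00, 5.50]), 15.50)

    def test_total_with_discount(self):
        # Spec from step 2: a 10% discount applies to the whole cart.
        self.assertEqual(cart_total([10.00, 10.00], discount=0.10), 18.00)

if __name__ == "__main__":
    unittest.main()
```

The point isn't the code itself; it's that the assertions were decided alongside the wireframes, so at the end you can run them and measure how closely the build matches the plan.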
Testing individual classes / functions can help you with future-proofing, but most projects are measured by the quality of the current release, not the ease with which you can fix what is wrong with it.
As for progress, as much as people want some metric that will tell you whether a project took too long or went too fast, there really is none. Things tend to take as long as they need to; if you try to reduce that time, the final product isn't as good. However, the threat of a looming deadline is a powerful motivator. It is a delicate balance.
I really haven't found any tools that promoted better software in the end, at least as far as measuring quality goes. There are loads of tools for reaching the goal faster, but for evaluating the success of that goal, nothing really beats a pre-game wireframe / test case document and a post-game evaluation of how well they match.
Don't ever let anyone tell you that you can't design a whole project in great detail before writing code. You are GUARANTEED a mess if you don't.