700 GB is no longer huge in the SQL Server world, just large.
A good document to follow is here; it shows you a methodology to work through.
The main contenders for causing poor performance are lack of memory, poor maintenance, and poor SQL statements.
It is hard to be precise about memory requirements without knowing your workload, but with a database of this size you should have a minimum of 8 GB and potentially 32 GB. You really need a 64-bit server to make the most of the resources available.
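Whatever amount you settle on, make sure SQL Server is not starving the OS: cap it with `sp_configure`. A sketch, where the 26624 MB figure is only an example for a 32 GB box, not a recommendation:

```sql
-- Show the current max server memory setting
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';

-- Example only: cap SQL Server at 26 GB on a 32 GB server,
-- leaving headroom for the OS and any other processes
EXEC sp_configure 'max server memory (MB)', 26624;
RECONFIGURE;
```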
Rebuilding indexes can often greatly help performance. Sometimes manually updating statistics between index rebuilds can also help. Definitely avoid shrinking your database: a shrink will undo any benefit from the index rebuilds and leave you with NTFS fragmentation that harms performance until you run an NTFS defrag.
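Before rebuilding everything, check which indexes are actually fragmented. One way, using the index physical stats DMV (the 30% and 1000-page thresholds are common rules of thumb, not hard limits, and the table name in the comment is just a placeholder):

```sql
-- Find fragmented indexes in the current database
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30   -- common rebuild threshold
  AND ips.page_count > 1000                   -- ignore tiny indexes
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Between rebuilds, statistics can be refreshed manually, e.g.:
-- UPDATE STATISTICS dbo.YourBigTable WITH FULLSCAN;
```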
Poorly performing SQL statements usually fall into two groups: those that can be fixed by adding indexes, and those that need rewriting. The document shows ways of finding whether additional indexes would help; Google will turn up plenty more advice. Do not just add indexes in the hope that performance will improve. Use the facilities available in SQL Server to show whether an index will really be useful, and if you add an index and it does not help, remove it.
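One of those facilities is the set of missing-index DMVs, where the optimizer records indexes it would have used. A query along these lines surfaces them; treat the output as suggestions to evaluate, not commands:

```sql
-- Candidate indexes the optimizer has flagged, roughly ranked by estimated benefit
SELECT TOP (20)
       migs.avg_user_impact,
       migs.user_seeks + migs.user_scans AS uses,
       mid.statement AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       mid.included_columns
FROM sys.dm_db_missing_index_group_stats AS migs
JOIN sys.dm_db_missing_index_groups AS mig
  ON mig.index_group_handle = migs.group_handle
JOIN sys.dm_db_missing_index_details AS mid
  ON mid.index_handle = mig.index_handle
ORDER BY migs.avg_user_impact * (migs.user_seeks + migs.user_scans) DESC;
```

The companion DMV `sys.dm_db_index_usage_stats` shows whether an index you added is actually being read; if it only ever shows up as write overhead, drop it.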
Start with tempdb: make sure it is sized correctly.
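Pre-sizing the tempdb files means they never have to autogrow during production hours. A sketch, where the sizes are examples only and `tempdev`/`templog` are the default logical file names (yours may differ):

```sql
-- Example only: pre-size tempdb data and log files
ALTER DATABASE tempdb
MODIFY FILE (NAME = tempdev, SIZE = 8192MB, FILEGROWTH = 512MB);

ALTER DATABASE tempdb
MODIFY FILE (NAME = templog, SIZE = 2048MB, FILEGROWTH = 256MB);
```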
Then set up a Perfmon capture and look at how the server is running in terms of memory (buffer cache), CPU, and disk I/O.
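The SQL Server side of those counters is also visible from T-SQL. For example, page life expectancy gives a quick read on buffer-cache memory pressure (the often-quoted 300-second floor is only a rule of thumb):

```sql
-- Page life expectancy: seconds a page stays in the buffer pool
SELECT object_name, counter_name, cntr_value AS seconds
FROM sys.dm_os_performance_counters
WHERE counter_name = 'Page life expectancy'
  AND object_name LIKE '%Buffer Manager%';
```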
Also look at your execution plans and make sure you are not doing full table scans.
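You can hunt for scans across the whole plan cache rather than one query at a time. A sketch using the plan-cache DMVs; the XML test is simplified, and this can be slow on a large cache:

```sql
-- Find cached plans that contain table or clustered index scans
WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT TOP (20)
       qs.total_worker_time,
       qs.execution_count,
       SUBSTRING(st.text, 1, 200) AS query_text,
       qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE qp.query_plan.exist('//RelOp[@PhysicalOp="Table Scan"
                                or @PhysicalOp="Clustered Index Scan"]') = 1
ORDER BY qs.total_worker_time DESC;
```

Remember a scan is not always wrong: for a query that genuinely needs most of a table, a scan is the right plan.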