Ideally, you want Google to index all of the pages on your site, unless there are particular pages you want Google and the other engines to ignore for whatever reason. This can be accomplished by adding the necessary directives to an .htaccess file, which can send a noindex signal (via an X-Robots-Tag header) telling the engines not to list those pages. In all likelihood, Google will still scan them; it just won't show them in the search results.
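As a minimal sketch, here's what that might look like in an .htaccess file on an Apache server with the mod_headers module enabled (the filenames here are hypothetical placeholders, so swap in your own):

    # Tell search engines not to list these pages in their results
    # Requires Apache's mod_headers module
    <FilesMatch "(private-page|thank-you)\.html$">
        Header set X-Robots-Tag "noindex"
    </FilesMatch>

Crawlers that honor the header may still fetch these pages, but they should drop them from their listings.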
Getting Google to properly scan all of your pages can be accomplished by making sure your site is completely accessible and free of broken links. You can use Google's Webmaster Tools to find broken links and pages that are causing problems.
One dangerous technique that you should always avoid is known as "cloaking": serving alternate pages to Google's bots versus human visitors. In many cases, cloaking scripts also generate thousands of fake pages on the fly whenever they detect a bot trying to index them, to give the appearance of a much larger site. The engines wised up to this technique long ago, and it's a quick way to get your site banned and blacklisted.