There is no foolproof method that will prevent Google from knowing that your pages exist. Google knows a page exists as soon as it finds a link to it, and you can't prevent every public site from linking to your private pages: any of your users can create such links.
You also say that you want to prevent problems with duplicate content. In that case, either noindex or a robots.txt disallow should work fine; either would prevent Google from indexing the duplicate content. However, even if you did allow Google to index duplicate content, Google rarely penalizes for it. When Google finds duplicate content, it usually just chooses one of the two copies to put in its search index. If you are syndicating content from somewhere else, they may require that you hide your copy from Google so there is no possibility that your copy is chosen for the index instead of theirs. See What is duplicate content and how can I avoid being penalized for it on my site?
nofollow
<a href="http://example.com/some-page.html" rel="nofollow">anchor text</a>
Google recently announced that nofollow is now treated as a hint rather than a directive it will always respect. Don't rely on nofollow to prevent Google from seeing links or discovering the linked pages.
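If you want every link on a page treated as nofollow, a page-level robots meta tag can do the same thing (a minimal sketch; the link-level rel attribute above is the more targeted option):

<meta name="robots" content="nofollow">

The same caveat applies: Google now treats this as a hint, not a guarantee.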
noindex
<meta name="robots" content="noindex">
<meta name="googlebot" content="noindex">
X-Robots-Tag: noindex
If Google finds a meta tag with noindex or an HTTP header with noindex, it will not put that page into its search index, and the page will not appear in Google search results. Note that noindex won't prevent Google from knowing the page is there: Googlebot will still crawl the page and see all of its contents; it just won't add the page to the search index.
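For files where you can't add a meta tag (PDFs, images, and so on), the X-Robots-Tag header can be set in your server configuration instead. A minimal sketch for Apache, assuming mod_headers is enabled (the .pdf pattern is just an example):

<FilesMatch "\.pdf$">
Header set X-Robots-Tag "noindex"
</FilesMatch>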
robots.txt
User-agent: *
Disallow: /some-page.html
If you disallow a page in robots.txt, Google won't crawl the page. Even when Google finds links to the page and knows it exists, Googlebot won't download the page or see its contents. Google will usually not index the URL, but that isn't guaranteed: Google may include the URL in the search index, along with words from the anchor text of links to it, if it judges the page to be important.
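To block an entire section of a site rather than a single page, a directory rule works the same way (a sketch; the /private/ path is hypothetical):

User-agent: *
Disallow: /private/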
password protection
AuthType Basic
AuthName "Password Required"
AuthUserFile /directory/.htpasswd
Require valid-user
If you truly have sensitive content on your site, it is best to password-protect it. Again, password protection won't prevent Google from knowing the page exists, but Google won't be able to crawl the content, nor will it include the pages in its search index.
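The .htpasswd file referenced by AuthUserFile above is created with Apache's htpasswd utility. A sketch, where username is a placeholder (the -c flag creates the file; omit it when adding more users):

htpasswd -c /directory/.htpasswd username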