The question covers two things: partial indexing of a site and partial indexing of a page.
Yes, you can use robots.txt or a `<meta name="robots">` tag to indicate what you do not want indexed.
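For reference, a minimal example of each mechanism (the paths are only illustrative):

```
# robots.txt at the site root — asks crawlers to skip a whole directory
User-agent: *
Disallow: /private/
```

```
<!-- in the <head> of an individual page you do not want indexed -->
<meta name="robots" content="noindex">
```

The robots.txt rule operates at the site level; the meta tag operates page by page. Both are requests, not enforcement.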
In fact, if the content is otherwise protected against access, you do not even need them: if a page is only accessible with a password, its content will not be indexed.
Access control is actually the only effective way to prevent indexing, since declaring that you do not want something indexed is just a convention, and an indexer is free to ignore it. Google respects it today, but could stop whenever it wants, and there are malicious crawlers that never respected it at all.
Obviously, only access control enforced on the server is effective.
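A minimal sketch of what "enforced on the server" means, using HTTP Basic authentication and Python's standard library (the user store and handler names are made up for illustration; a real application would store hashed passwords):

```python
import base64

# Hypothetical credential store for the sketch.
USERS = {"alice": "s3cret"}

def authorize(auth_header):
    """Return True only if the Authorization header carries valid
    HTTP Basic credentials; anything else is refused."""
    if not auth_header or not auth_header.startswith("Basic "):
        return False
    try:
        decoded = base64.b64decode(auth_header[len("Basic "):]).decode("utf-8")
        user, _, password = decoded.partition(":")
    except Exception:
        return False
    return USERS.get(user) == password

def handle_request(auth_header):
    # The protected content is only ever sent after the server-side
    # check passes, so no crawler (well-behaved or not) can read it.
    if authorize(auth_header):
        return 200, "the protected content"
    return 401, "authentication required"
```

The key point is that the decision happens before the content leaves the server; there is nothing on the wire for a crawler to index.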
All this is well known fact. I think the biggest question is partial indexing of the page.
This is usually done by identifying that the client requesting the page is an indexer and serving it a different page with the full content, while normal clients get a partial version. Obviously a client can deceive the site by claiming to be the indexer: it then receives the whole page and can read all the content, while a normal client receives the page with the content hidden behind the teaser. This maneuver (cloaking) can lead to indexing penalties if the search engine detects it. And of course, it will always be possible to reach the content through the indexer's cache.
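A sketch of the mechanism described above, in Python (the bot names and page strings are assumptions for illustration; this is the technique being warned against, not a recommendation):

```python
# Cloaking by User-Agent: serve the full page to clients that *claim*
# to be a known indexer, and a truncated page to everyone else.
KNOWN_INDEXERS = ("Googlebot", "Bingbot")  # illustrative list

FULL_PAGE = "teaser... plus the full article body"
PARTIAL_PAGE = "teaser... [subscribe to read more]"

def page_for(user_agent):
    """Pick which version of the page to serve based on the User-Agent."""
    if user_agent and any(bot in user_agent for bot in KNOWN_INDEXERS):
        return FULL_PAGE
    return PARTIAL_PAGE
```

Note that any client can send `User-Agent: Googlebot` and receive `FULL_PAGE`, which is exactly the weakness described above. Sites that do this in practice also try to verify the crawler's IP address, but the cached-copy problem remains either way.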
Obviously you can also send all the content and hide part of it with JavaScript. This protects nothing; it only pretends to, since the content is still there in the source. It may stop a layman, but it is not protection.
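To make that concrete, here is a sketch showing that content "hidden" client-side still ships in the HTML and can be pulled straight out of the source without ever running the page's JavaScript (the markup and `id` are invented for the example; a real scraper would use a proper HTML parser rather than a regex):

```python
import re

# Page as delivered by the server: the "protected" text is in the
# source, merely styled as invisible until some script reveals it.
html = """
<p>Visible teaser.</p>
<div id="paywalled" style="display:none">The supposedly protected text.</div>
"""

# No browser, no JavaScript — just read it out of the raw HTML.
hidden = re.search(r'<div id="paywalled"[^>]*>(.*?)</div>', html).group(1)
print(hidden)
```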
I'd also like to address the myth that crawlers run JavaScript. Some do, but not all, and they cannot simulate user actions the way a real user does. So do not count on indexing if the content depends on user interaction, or on anything else the indexer cannot do; and new things indexers cannot simulate appear all the time. Script exists on a page precisely to define non-standard flows, and that by definition makes it impossible in practice to simulate everything that might occur.
If you want to protect the content, the only option is controlling it on the server. And obviously that only controls the display: it does not prevent a person from copying the content and posting it elsewhere, even automatically. It's worth making that clear, because some people still believe in Santa Claus.