Cloaking refers to showing search engine bots different content than you show human visitors. Bots can be identified through IP detection and served keyword-stuffed text. Cloaking may also include sneaky redirects, as in BMW's case, a tactic that got the site banned from Google search.
But cloaking is not always taboo; consider the cases of IP delivery/geolocation and paywalled content.
IP delivery and geolocation tools either serve up location-based content (by language or region) or redirect users to a localized site targeted to their country, based on IP address. Google's position is that as long as you serve Googlebot the same content you would serve any other visitor from that IP address or region, IP delivery and geolocation are not considered cloaking.
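The principle above can be sketched in a few lines: the locale is chosen purely by the visitor's country, never by whether the visitor is a bot. This is a minimal illustration, not a production setup; `country_for_ip` is a hypothetical stand-in for a real GeoIP lookup, and the content strings are invented.

```python
# Spam-safe IP delivery sketch: content varies only by the visitor's
# country, so Googlebot crawling from a US IP sees exactly what any
# US visitor would see.

LOCALIZED_CONTENT = {
    "GB": "Prices shown in GBP. Free UK delivery.",
    "DE": "Preise in EUR. Kostenloser Versand innerhalb Deutschlands.",
}
DEFAULT_CONTENT = "Prices shown in USD."

def country_for_ip(ip):
    # Hypothetical lookup table; swap in a real GeoIP database here.
    sample = {"81.2.69.142": "GB", "85.214.132.117": "DE"}
    return sample.get(ip)

def content_for_visitor(ip):
    """Pick content by IP-derived country only -- no user-agent checks."""
    return LOCALIZED_CONTENT.get(country_for_ip(ip), DEFAULT_CONTENT)
```

The key design point is what the function does *not* do: there is no branch for Googlebot, which is exactly what keeps this on the right side of Google's cloaking definition.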
Digital content publishers who sell piecemeal or by subscription can feed full-text content to Google and restrict access for visitors, provided they allow the "first click free": a visitor can view one page or one multi-page article on a paywalled site before being asked to log in/register or purchase the content/subscription. Allowing Google to index full text is a powerful way to attract new customers for paid content.
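One common way sites implement "first click free" is a referrer check: a visitor landing from a Google results page gets the full article, and subsequent page views hit the paywall. The sketch below is an assumption about implementation, not a Google-specified mechanism; `pages_viewed` stands in for whatever session/cookie counter a real site would use.

```python
from urllib.parse import urlparse

def serve_full_article(referrer, pages_viewed):
    """'First click free' gate sketch: serve the full article only on
    the first page view when the visitor arrives from Google search.
    pages_viewed is a hypothetical per-session counter (e.g. a cookie)."""
    host = urlparse(referrer or "").netloc
    came_from_google = host == "google.com" or host.endswith(".google.com")
    return came_from_google and pages_viewed == 0
```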
After the last post, there may be some confusion about paid links. Paid links are not spam, so long as they carry the "nofollow" attribute. This indicates that the link is advertising rather than an organic, editorial link from the content publisher. Google can then exclude such links from its PageRank calculation, and the publisher does not risk a loss of its own PageRank for the crime of shilling content.
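In markup, a properly disclosed paid link looks like this (the URL and anchor text are placeholders):

```html
<!-- Paid link: rel="nofollow" tells Google to exclude this link
     from its PageRank calculation -->
<a href="https://example.com/product" rel="nofollow">Acme Widget Pro</a>
```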
What really prompted the search engines to crack down on paid links 3 years ago was the popularity of services like PayPerPost, which served as a marketplace for content publishers and advertisers. For about $10-$100 (depending on the blog's authority), you could get your business or product reviewed with the anchor text of your choice. The blogger was required to disclose the sponsorship somewhere in the post, but the nofollow attribute was not required on links. Once search engines agreed to crack down on this activity, thousands of blogs saw a drop in precious PageRank by association with the service.
While it's true that nofollowed links will not benefit SEO directly, the increased branding and traffic you receive from sponsored posts and other paid links can lead to natural links that do count (including social links) and, more importantly, more business, provided you target the right media.
Autogenerated Keyword Pages
Some black hatters use a script or software program to create thousands of keyword-targeted (and keyword stuffed) pages with very little "real" content, designed to rank highly rather than to provide value for visitors. Naturally, Google frowns upon this spammy technique.
But there are legitimate reasons to create a keyword-targeted page that can rank well in search engines. For example, the Bizrate shopping engine automatically creates custom search pages based on on-site searches from real customers. These pages are linked through "recent searches" or "related searches" tags throughout the site so they can be crawled and indexed, and can rank for long-tail product searches.
Some may consider auto-generated category pages to be "gray hat" - pushing the boundaries of what's considered spam. Google recommends using robots.txt to "prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines" (emphasis mine). Personally, I believe these pages do add value. They direct searchers to the most relevant category page for what the searcher is looking for, and deserve to be in the search index. It only becomes spam if pages are allowed to be generated when they contain zero results.
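If you do decide your auto-generated pages don't belong in the index, Google's recommendation translates to a robots.txt rule like this (the /search path is a placeholder for wherever your results pages actually live):

```
User-agent: *
Disallow: /search
```

Conversely, pages you want crawled, such as the search-driven category pages described above, simply need to be left out of any Disallow rule and linked from crawlable pages on the site.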
Duplicate Content Across Domains
"Mirror sites" are identical or near-identical copies of a web site created with the intention to flood the search engine results pages and/or interlink between each other. This practices is undoubtedly black hat spam. But what about when you operate legitimate online stores that are localized for different English-speaking countries that share most or all of the same products and descriptions? Are these considered spam? Do you need to write separate product descriptions for each country-targeted domain/subdomain/subfolder?
While it's wise to take culture into account and adjust your category names, product names and descriptions accordingly (e.g. "crockpot" in the US vs. "slow cooker" in the rest of the world), you're in the clear with localized websites, provided you geotarget them in Webmaster Tools. This also helps Google return the right TLD (top-level domain) in localized search, i.e. your .com pages won't rank in Google.co.uk.
While you always want to steer clear of anything that will violate Google's Quality Guidelines, don't fear IP delivery, geolocation, paid links (with disclosure and nofollow attributes), search-based dynamic category creation or localized web sites - provided you do so within Google's recommendations, and consider what is best for your customers.