The flood of news stories on the data-collection and online behavioral advertising (“OBA”) practices of search engines, mobile apps, brand advertisers, and social networks is giving many people a distinct feeling: “the creeps.”

Whether the stories are about concerns over Facebook going public, Google bypassing default browser settings, or Target figuring out that a man’s daughter is pregnant before he does, the natural reaction is to picture the companies’ employees as shadowy, green-eyed peepers crouching in the darkness. I’m sure they’re just regular folks, but the image is inescapable.

The curious thing is that, aside from a general sense of creepiness, identifying the direct “harm” caused by OBA can be elusive. Courts struggle with this issue in many privacy lawsuits. Plaintiffs often fail because they can’t show legally cognizable harm. For their part, regulators are clearly unsettled by OBA, but even after getting comments from dozens of interested parties for a 2009 report, the FTC was unable to articulate whether or how OBA directly harmed consumers. Academics have done interesting research on the pros and cons of OBA, but the research has not yet translated into any consensus on acceptable practices.

Consequently, industry leaders, privacy advocates, and regulators have not established normative rules based on the harm caused by different forms of OBA. Instead, they have focused on creating a comprehensive “notice and choice” regime. Under this regime, consumers are meant to see how their data is used and choose whether they want to allow such use. This is great, assuming companies participate, but it ignores a critical real-world problem. When it comes to OBA, most consumers are disadvantaged by what experts call “knowledge asymmetry.” In other words, even if companies tell consumers exactly what data they’re collecting and how they’re using it, most people don’t have the expertise to understand the full implications. This reality challenges the notion of “informed consent” and suggests that “notice and choice” are not enough.

But in the absence of concrete harm, how do we distinguish OBA practices that are benign from those that are unacceptably intrusive? Unfortunately, public uproar over the latest privacy outrage tends to blur these distinctions. There are, however, at least seven factors that stand out as significant “creepiness” indicators. OBA that scores high on any of these factors should be scrutinized carefully and, at a minimum, industry leaders should consider establishing guidelines that discourage such practices.

Creep Factor No. 1: Linking Behavioral Data with Unique Identifiers

One of the most powerful ways to deliver targeted ads to consumers is to assign a unique identifier to individuals and track their online behavior across multiple sites, platforms, and apps. However, as Apple found with the outcry over its use of UDIDs (Unique Device Identifiers), this is also one of the practices consumers find most disturbing. Although Apple is eliminating the use of UDIDs from its development platform, app developers (and their marketing executives) are pushing hard to find alternatives. Some mobile marketing companies advocate the use of MAC addresses in lieu of UDIDs. Others have proposed an open source UDID alternative. Setting aside security concerns associated with some of the UDID alternatives (MAC addresses? Really?), the problem with these alternatives is that they aren’t really any less creepy than the technology they seek to replace.
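To make the mechanics concrete, here is a minimal sketch (in TypeScript, with a hypothetical event shape and field names of my own invention, not any vendor’s actual SDK) of the pattern at issue: every behavioral event is keyed to one persistent device identifier, so activity from unrelated apps and sites can be joined into a single longitudinal profile.

```typescript
// Hypothetical sketch of identifier-keyed tracking. The types and names are
// illustrative only; the point is that one stable key ties everything together.
interface BehavioralEvent {
  deviceId: string;   // persistent identifier (UDID, MAC address, or a stand-in)
  timestamp: number;  // when the behavior occurred
  context: string;    // e.g., "news-app", "shopping-site"
  action: string;     // e.g., "viewed-article", "added-to-cart"
}

// Because deviceId never changes, events collected by different apps and sites
// can be merged into a single profile of one person.
function buildProfile(events: BehavioralEvent[], deviceId: string): BehavioralEvent[] {
  return events.filter((e) => e.deviceId === deviceId);
}
```

The creepiness, in other words, is structural: swapping the UDID for a MAC address or an open source identifier changes the key, not the join.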

Creep Factor No. 2: Detail and Scope of Data Collection

Most people have some tolerance for “being watched.” After all, we’re social creatures, and we understand that, at some level, others will observe what we do and try to gain advantages from what they learn. But there’s a point at which data collection can make consumers feel like they’re trapped in a kind of Orwellian Panopticon. For example, if a data collection practice is both broad (i.e., relating to behavior in multiple contexts, like emailing, texting, web browsing, and voice calling) and granular (i.e., capturing details of the behavior, as in keystroke-logging), expect a sharp rise in the sale of tin-foil hats because consumers will do anything to avoid this kind of practice. Just ask companies like Phorm and NebuAd, who partnered with ISPs a couple of years ago to use deep-packet inspection technology to deliver targeted ads to users. If you want to know how that story ends, you can read all about it in the transcripts of the congressional hearing. Fun to read; not so fun to be there in person.

Creep Factor No. 3: OBA Based on “Negative” Assumptions

It’s hard to envision how regulators would address this Creep Factor because of its inherent subjectivity, but it’s still relevant. OBA is all about making assumptions based on known features of the consumer. However, these assumptions can have negative, positive, or neutral connotations. If the underlying assumptions are negative, consumers will likely find this intrusive. For example, if I’m a marathon runner, I’m perfectly fine getting targeted ads promoting the latest workout app. If I’m a pudgy couch potato…not so much. (I’m actually a 42-year-old attorney who spends most of his day sitting in front of a computer monitor, so you can guess which scenario I identify with.) Consumers are much more likely to find OBA based on negative assumptions (e.g., you’re fat and need to work out) intrusive, not to mention tacky.

Creep Factor No. 4: Sensitivity of Data

There’s a reason the ancient penalty for peeping Toms was gouging out their eyes. Some data is so sensitive that, even if it’s anonymized, consumers will not tolerate its collection and use. For a notably creepy example, read the Wall Street Journal’s reporting on Nielsen Co.’s practice of scraping a private online forum for discussion threads from people suffering from emotional disorders. Nielsen was monitoring what consumers were saying about various pharmaceutical products on the forum. The information Nielsen collected wasn’t tied to individuals and wasn’t used for direct marketing purposes. But when the story broke, you could almost hear consumers sharpening their stakes.

Creep Factor No. 5: Impact on Operability

This is one Creep Factor that courts view as a legally cognizable harm. If data collection and tracking technology significantly impacts the operability of users’ computers or mobile devices, as in the case of spyware, adware, and malware, the sense of intrusion can be overwhelming. Consumers will run, not walk, away from these kinds of practices.

Creep Factor No. 6: Ease of Opting Out

I just can’t help but use a tracking technology with the word “zombie” in it to illustrate one of the Creep Factors. And so-called “zombie cookies” warrant the attention. Zombie cookies are HTTP cookies that are automatically recreated (I prefer the word “respawned”—much creepier) after users attempt to delete them, typically from backup copies stashed outside the browser’s normal cookie storage, such as Flash local shared objects or HTML5 storage. This technology can make it virtually impossible for users to opt out of being tracked. Any company using zombie cookies to collect or monetize sensitive information is about as wholesome as John Hinckley, Jr.
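For the curious, here is a simplified, hypothetical sketch of how the respawning trick works in browser code (TypeScript, using localStorage as the backup store; real implementations have used Flash local shared objects and other hideouts). The function names are mine, not any actual tracking library’s:

```typescript
// Hypothetical illustration of cookie "respawning." Not production tracking code.
const COOKIE_NAME = "tracking_id";

function readCookie(name: string): string | null {
  const match = document.cookie.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

function writeCookie(name: string, value: string): void {
  document.cookie = `${name}=${encodeURIComponent(value)}; max-age=31536000; path=/`;
}

// The "zombie" behavior: the ID is mirrored in localStorage, so deleting the
// cookie accomplishes nothing. It is quietly recreated on the next page load.
function respawnTrackingId(): string {
  let id = readCookie(COOKIE_NAME) ?? localStorage.getItem(COOKIE_NAME);
  if (!id) {
    id = crypto.randomUUID();            // mint a new ID only if no copy survives anywhere
  }
  writeCookie(COOKIE_NAME, id);          // restore the cookie the user just deleted
  localStorage.setItem(COOKIE_NAME, id); // keep the backup copy in sync
  return id;
}
```

Deleting your cookies only kills the visible copy; the backup brings it right back, which is exactly why opting out becomes a losing game of whack-a-mole.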

Creep Factor No. 7: Lack of Notice

While the finer points of “layered” and “enhanced” notice are better left for discussion elsewhere, it’s safe to say that any OBA data-collection practice conducted with absolutely no consumer notice is seriously creepy. A good example of this is a practice called “device fingerprinting.” Device fingerprinting creates a unique identifier for computers, cell phones, and other devices based on a combination of externally observable characteristics like installed font styles, clock settings, TCP/IP configuration, etc. In addition to being problematic because it creates a persistent, unique identifier (see “Creep Factor No. 1”), device fingerprinting also raises red flags because it takes place with no consumer notice. This information is collected “passively,” and in most instances users can’t even detect that it’s happening.
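As a rough sketch of the mechanism (hypothetical and stripped down; real fingerprinting libraries combine far more signals, such as installed fonts, plugins, and canvas rendering), a handful of browser-observable traits are concatenated and hashed into a stable identifier, with no cookie set and nothing shown to the user:

```typescript
// Hypothetical sketch of device fingerprinting. Only a few example signals are
// shown; the point is that passively observable traits yield a stable ID.
async function deviceFingerprint(): Promise<string> {
  const traits = [
    navigator.userAgent,                                      // browser and OS details
    navigator.language,                                       // locale
    `${screen.width}x${screen.height}x${screen.colorDepth}`,  // display characteristics
    String(new Date().getTimezoneOffset()),                   // clock/timezone setting
  ].join("|");

  // Hash the concatenated traits into a stable identifier. No cookie is set,
  // nothing is stored on the device, and the user sees no indication of it.
  const bytes = new TextEncoder().encode(traits);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```

Because nothing is written to the user’s device, there is nothing to delete and, absent notice, nothing to consent to.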

Conclusion

There are undoubtedly many other “Creep Factors,” but I’ve tried to identify the worst. The point is that not all data collection and OBA poses the same threat to consumers’ sense of personal privacy. By identifying specific practices likely to be viewed as intrusive, industry leaders, trade organizations, and regulatory bodies may find it easier to determine the level of notice required, or whether some practices should be prohibited outright. These criteria may also be useful for companies developing OBA and tracking technologies that want to build sustainable businesses.

After all, nobody likes a creep.