A scientific study several years ago indicated that the best way for people to lose weight was to have friends who were dieting. The impact of peer pressure on behavior has long been studied. Now, according to an article in CNN Money, a number of companies like Lenddo, Kreditech and Kabbage are trying to bring this "peer pressure" mentality to the measurement of credit risk. This trend goes a long way toward answering the ultimate privacy question, "If I am not doing anything wrong, why should I care about privacy?"
The new credit reporting companies use data analytics to measure a consumer's likelihood of default, drawing not only on that person's own factors, but also on the factors of his or her contacts, friends and associates on social networking sites like Facebook, LinkedIn and Twitter. They measure not only the identity of a consumer's contacts, but the strength of the connection to each contact, and they gauge the consumer's creditworthiness in part by the creditworthiness of those associates.
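To make the mechanism concrete, here is a minimal sketch of what such "social graph" scoring might look like. Everything in it is an illustrative assumption on my part (the blending weight, the function name, the shape of the inputs); no vendor has disclosed its actual model.

```python
# Hypothetical sketch of social-graph credit scoring as the article
# describes it: blend a consumer's own score with a connection-strength-
# weighted average of the contacts' scores. All weights are invented
# for illustration; they are not any company's real model.

def social_credit_score(own_score, contacts, social_weight=0.3):
    """Return a blended score.

    contacts: list of (contact_score, connection_strength) pairs,
              with strength in [0, 1] (a close friend near 1.0,
              a casual follower near 0.0).
    social_weight: assumed fraction of the score driven by the
                   social component (30% here, purely hypothetical).
    """
    if not contacts:
        return own_score
    total_strength = sum(strength for _, strength in contacts)
    if total_strength == 0:
        return own_score
    # Weighted average of contacts' scores, by connection strength.
    social_avg = sum(score * strength for score, strength in contacts) / total_strength
    return (1 - social_weight) * own_score + social_weight * social_avg

# A consumer with a 700 score, one close friend at 550 and one
# distant acquaintance at 800: the close friend drags the blend down.
blended = social_credit_score(700, [(550, 0.9), (800, 0.1)])  # ≈ 662.5
```

Note the design point the article turns on: the consumer's own 700 is diluted by people the consumer never chose to have rated.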
For retailers using those new-style credit reports, that probably sounds like a much more complete picture of the customer's likely behavior. There's just one problem: It may not answer the question of whether the new reports actually say anything about how credit-worthy the customer really is.
The concept is that birds of a feather tend to flock together, and if your friends are credit-worthy, you are likely to be as well. Of course, the converse is true as well—if your buddies are deadbeats (remember college, anyone?), you probably are too.
In addition to measuring friends, companies can use an increasingly detailed portrait of a consumer in determining not only whether to extend credit, but also the interest rate at which to extend it. The federal Fair Credit Reporting Act requires consumer credit reports to be accurate, and gives consumers certain rights with respect to their use.
But the term "consumer credit report" is likely broader than most people anticipate. It is "any information [used] by a consumer reporting agency bearing on a consumer's credit worthiness, credit standing, credit capacity, character, general reputation, personal characteristics, or mode of living which is used or expected to be used or collected in whole or in part for the purpose of serving as a factor in establishing the consumer's eligibility for credit..."
These companies that scour Facebook postings and Twitter feeds are likely to be considered "consumer reporting agencies" under the FCRA, which are defined as any person "which, for monetary fees, dues, or on a cooperative nonprofit basis, regularly engages in whole or in part in the practice of assembling or evaluating consumer credit information or other information ... for the purpose of furnishing consumer reports to third parties ..." Since they collect this information to help others determine a customer's creditworthiness, they are regulated by the FCRA.
What that means is that, if you're a consumer, to the extent a consumer reporting agency uses an analysis of your friends' credit scores, your website postings, your Twitter feed or other data to determine your credit risk, the FCRA requires that information to be accurate, and gives you certain rights with respect to it.
But what does it mean for the information to be "accurate"? Does it mean that you are, in fact, Facebook "friends" with Louis the lowlife and Dave the deadbeat? Or that Louis and Dave are in fact bad credit risks? Or that their status is an accurate predictor of your behavior?
Suppose some company determined (through big-data analysis) that people who wore black high-top Converse Chuck Taylor sneakers were 18 percent more likely to default on a mortgage than those who wore Adidas, and then combed your Instagram feed for pictures of your footwear. Suddenly, for reasons unknown to you, you get a call from American Express stating that your credit limit has been lowered (what the FCRA calls an "adverse action") based upon information it received from a credit reporting agency.

While you are entitled to challenge the accuracy of the information (not my feet, not my sneakers), are you permitted to challenge the causal connection between tennis shoes and default rates? Probably not. That's the mystery inherent in things like FICO Next Gen, Pinnacle and Precision credit scoring. While we generally know the factors that go into the score (credit extended, payment history, delinquency, etc.), we don't know how those factors are weighted.
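The opacity problem above can be sketched in a few lines. The factor names and weights below are entirely made up for illustration; the point is only that when the weights are proprietary, a consumer can verify every input and still have no way to contest how a spurious correlate moves the output.

```python
# Toy "opaque score": a base value plus a weighted sum of factors.
# The weights are hypothetical and, in the real world, hidden from
# the consumer -- that hiddenness is the problem being illustrated.

WEIGHTS = {
    "payment_history": 0.40,        # plausibly causal factors...
    "utilization": 0.30,
    "delinquencies": -0.25,
    "wears_chuck_taylors": -0.05,   # ...and a spurious correlate
}

def opaque_score(factors, base=650, scale=200):
    """Score = base + scale * weighted sum of factor values,
    where each factor value is normalized to [0, 1]."""
    return base + scale * sum(WEIGHTS[k] * v for k, v in factors.items())

consumer = {"payment_history": 0.9, "utilization": 0.4, "delinquencies": 0.0}
with_sneakers = dict(consumer, wears_chuck_taylors=1.0)

# Identical payment behavior, different footwear, lower score:
delta = opaque_score(consumer) - opaque_score(with_sneakers)  # 10 points
```

Every input here is "accurate" in the FCRA sense, and yet the 10-point penalty is unchallengeable unless the weighting itself is disclosed.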
With more big-data analytics, and with an exponential increase in the amount and type of data being considered, we will see companies treating all sorts of things as indicators of creditworthiness: how often we post travel pictures (and where we travel), our public disclosures about our own or our families' medical conditions (the FCRA limits the use of medical information for credit, but it's not clear that limit applies to information posted on social media), records that suggest risky social behavior, even the kinds of clothes we wear or how often we go out.
They can link social media information with payment information, credit information, employment information, purchasing history (not only from credit and debit card transactions, but directly by linking to merchants like Amazon, Apple, eBay and others) and get a more "complete" picture of us. To the extent they can get location data (either from the cell phone companies, or through a web app indirectly—think Progressive insurance's Snapshot program) this can be another factor in evaluating credit score or credit risk.
It's not just what the credit reporting agencies know (or think they know) about you. It's how they rate and score it. That's the beauty and the danger of big data. It may very well be that those Chuck Taylor wearers are bad credit risks, or that hanging out with losers makes you a loser—very high school cafeteria stuff. But too much hangs in the balance to rely on mere correlation.
I believe that credit reporting agencies and others who make decisions based on "big data" should be required to tell you what they base those decisions on—not only the underlying data, but also the algorithm that weights this data. Sure, people would alter their behavior based on this knowledge, but in the end, isn't this what we want them to do? Engage in behavior that makes them a better credit risk?
And people should automatically get copies of their credit reports from the three big credit reporting agencies (Experian, TransUnion and Equifax) without requesting them. They should also automatically get copies of the myriad specialized data analyses done by companies like Telecheck, ChoicePoint, Acxiom, Integrated Screening Partners, Innovis, the Insurance Services Office (ISO), Tenant Data Services, LexisNexis, Retail Equation, Central Credit, Teletrack, the Medical Information Bureau (MIB, a/k/a MIB Group, Inc.), UnitedHealth Group (Ingenix division), and Milliman.
These companies parse things like how many checks you write, how you utilize medical services, whether you have ever been sued or prosecuted, whether you have filed insurance claims, etc., to determine whether you get a job, get credit, get medical treatment, or get insurance. Consumers should know what they know. Or what they think they know. Automatically.
All of this is why consumers need to worry about privacy even if they are doing nothing wrong. Companies will increasingly determine whether you get a job, or how much you are paid (or charged), based on information they glean from the myriad public and semi-public databases that exist and that will be created. More data means more inaccurate data—and more opportunities for abuse.
And you can bet that these inaccuracies and abuses will not generally be to the benefit of the consumer. Not when there is money to be made. So Congress (remember Congress?) may have to step in and regulate how companies can use social networking and other information. Until then, stick to wearing New Balance.
If you disagree with me, I'll see you in court, buddy. If you agree with me, however, I would love to hear from you.