Identity verification, fraud detection, and other risk analytics are increasingly offered by private companies that operate without any form of supervision.
Those interested in the current status of tracking and surveillance practices in the private sector should read the paper “Corporate Surveillance in Everyday Life” by Wolfie Christl. From page 32 onwards, the paper covers risk analytics services and the companies that provide them. From the introduction:
With the rise of technology-mediated services, verifying consumer identities and preventing fraud have both become increasingly important and challenging issues, especially in the light of cybercrime and automated fraud. At the same time, today’s data-driven identity and fraud analysis systems have aggregated giant databases with sensitive information about entire populations, ranging from names, addresses, and relationships between people to extensive behavioral and biometric data. Many of these systems are operated by private companies and cover a wide range of use cases, including proof of identity for financial services, assessing insurance and benefits claims, analyzing payment transactions, and evaluating a large variety of online transactions. They decide whether an application or transaction is accepted or rejected, or which payment and shipping options are available for someone during an online transaction. Commercial services for identity verification and fraud analytics are also used in areas such as law enforcement and national security. The line between commercial applications of identity and fraud analytics and those used by government intelligence services is increasingly blurring.
The paper mentions companies like Experian, Acxiom, Equifax, Verizon, LexisNexis, Trustev, TransUnion, Arvato, Symantec, ThreatMetrix, AdTruth and many others. Some of these companies collect biometric information on persons, including fingerprints and voice data. Governments probably already hold much more; from page 34:
In 2016, a report found that a facial recognition database used by the FBI contained photos of half of all adults in the US, without their knowledge or consent.
Device fingerprinting seems to be a common practice to track users across different devices. All these parties collect social security numbers (this appears to be allowed in those countries, unlike the Dutch rules regarding the BSN).
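The basic idea behind device fingerprinting can be sketched in a few lines: a tracker combines device attributes into a stable hash that re-identifies the device without cookies. This is a minimal illustrative sketch; the attribute names are hypothetical examples, and real fingerprinting services combine far more signals.

```python
import hashlib
import json

def device_fingerprint(attributes: dict) -> str:
    """Derive a stable identifier from a set of device attributes.

    Attribute names are hypothetical; real services use many more signals
    (canvas rendering, installed plugins, clock skew, etc.).
    """
    # Serialize deterministically so the same device yields the same hash.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Two requests with identical attributes map to the same fingerprint,
# so a tracker can recognize the device across visits without cookies.
attrs = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64)",
    "screen": "1920x1080",
    "timezone": "Europe/Amsterdam",
    "fonts": ["Arial", "DejaVu Sans"],
}
print(device_fingerprint(attrs) == device_fingerprint(dict(attrs)))  # True
```

Because many attribute combinations are rare in practice, even this handful of signals can single out a device with high probability.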
AML / CFT information
The report contains no specific information on parties providing information services to combat money laundering and terrorist financing. These services are probably provided by the same companies mentioned in the report.
- Announcement by the publisher, Cracked Labs
- Report “Corporate Surveillance in Everyday Life” by Wolfie Christl (pdf)
Addition of 17 July 2017
Theodore Rostow, a Yale student, wrote in November 2016 “What Happens When an Acquaintance Buys Your Data?: A New Privacy Harm in the Age of Data Brokers”. It concerns the US; abstract:
Data brokers have begun to sell consumer information to individual buyers looking to track the activities of romantic interests, professional contacts, and other people of interest. The types of data available for consumer purchase seem likely to expand over the next few years. This trend invites the emergence of a new type of privacy harm, “relational control” — the influence that a person can exert on another in their social or professional networks using covertly acquired private information.
Addition of 21 December 2017
Erik Kamenjasevic wrote two articles on profiling in the financial sector under the GDPR (1, 2). In the second article he describes cases in which a bank automatically processes a customer’s data, for example for:
- fraud detection and prevention
- credit risk scoring
- cash flow forecasting
Banks have to carefully assess their obligations under the GDPR:
However, even when one of the three exceptions applies and a bank delivers automated decisions to its customers, paragraph 3 of Article 22 of the GDPR imposes further obligations and requires the bank to implement suitable measures in order to safeguard the rights, freedoms and legitimate interests of a data subject.
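The kind of automated processing Article 22 addresses can be illustrated with a toy rule-based fraud check: a decision on a transaction is produced without any human involvement. The rules, thresholds, and field names below are hypothetical and purely for illustration; real bank systems are far more complex.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country: str                # where the transaction originates
    customer_home_country: str  # customer's registered country

def fraud_score(tx: Transaction) -> float:
    """Toy rule-based score; rules and weights are hypothetical."""
    score = 0.0
    if tx.amount > 10_000:
        score += 0.5  # unusually large transaction
    if tx.country != tx.customer_home_country:
        score += 0.3  # transaction from abroad
    return score

def automated_decision(tx: Transaction, threshold: float = 0.6) -> str:
    # A decision made solely by automated means -- the kind of
    # processing paragraph 3 of Article 22 GDPR attaches safeguards to.
    return "blocked" if fraud_score(tx) >= threshold else "accepted"

print(automated_decision(Transaction(15_000, "US", "NL")))  # blocked
print(automated_decision(Transaction(100, "NL", "NL")))     # accepted
```

Under Article 22(3), a customer whose transaction is blocked this way would, among other safeguards, have the right to obtain human intervention and to contest the decision.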