User:Privacy4353/sandbox

Cross-device tracking

Cross-device tracking refers to technology which enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.[1]

More specifically, cross-device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one.[2] For example, one such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.[2]

This form of tracking is utilized primarily by technology companies and advertisers who use this information to piece together a cohesive profile of the user.[2] These profiles inform and predict the type of advertisements the user receives.[2]

Background

There are many ways in which online tracking has manifested itself. Historically, when companies wanted to track users’ online behavior, they simply had users sign in to their website.[1] This is a form of deterministic cross-device tracking, in which the user’s devices are associated with their account credentials, such as their email or username.[3] Consequently, while the user is logged in, the company can keep a running history of which sites the user has visited and which ads the user interacted with across computers and mobile devices.[3]
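As an illustration of this deterministic approach, a service only needs to record which device identifiers have signed in to the same account; the shared login then acts as the join key. The sketch below is a minimal, hypothetical illustration (the identifiers and function names are invented, not any company's actual system).

```python
from collections import defaultdict

# Deterministic cross-device linking: the account a user signs in with
# joins together every device that has authenticated with it.
account_devices: dict[str, set[str]] = defaultdict(set)

def record_login(account_id: str, device_id: str) -> None:
    """Associate a device with the account that just signed in on it."""
    account_devices[account_id].add(device_id)

def linked_devices(account_id: str) -> set[str]:
    """Return all devices deterministically linked to this account."""
    return account_devices[account_id]

# Example: the same user signs in on a laptop and later on a phone.
record_login("user@example.com", "laptop-7f3a")
record_login("user@example.com", "phone-91bc")
print(linked_devices("user@example.com"))  # {'laptop-7f3a', 'phone-91bc'}
```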

Eventually, cookies were deployed by advertisers, providing each user with a unique identifier in his or her browser so that the user’s preferences could be monitored.[4] This unique identifier informs the placement of relevant, targeted ads the user may receive.[4] Cookies were also used by companies to improve the user experience, enabling users to pick up where they left off on websites.[5] However, as users began utilizing multiple devices––up to around five––advertisers became unsure how to track, manage, and consolidate this data across multiple devices, as the cookie-based model suggested that each device––whether a phone, computer, or tablet––was a different person.[4]
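A minimal sketch of how a server might issue such a unique identifier as a long-lived cookie is shown below; the cookie name, lifetime, and use of Python's standard library are illustrative assumptions, not a description of any particular ad platform.

```python
import uuid
from http import cookies

def issue_tracking_cookie() -> str:
    """Build a Set-Cookie header that assigns this browser a unique identifier."""
    cookie = cookies.SimpleCookie()
    cookie["uid"] = uuid.uuid4().hex               # unique per browser, not per person
    cookie["uid"]["max-age"] = 60 * 60 * 24 * 365  # persist for roughly a year
    cookie["uid"]["path"] = "/"
    return cookie.output(header="Set-Cookie:")

# On every later request, the browser sends the cookie back, letting the
# server recognize the same browser and accumulate a browsing profile.
print(issue_tracking_cookie())
```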

Other technologies, such as supercookies, which stay on computers long after the user deletes his or her cookies, and web beacons, which are tiny, unique images embedded in a page via a URL, are also used by trackers and advertisers to gain increased insight into users’ behavior.[4] However, advertisers were still limited in that only one device could be tracked and associated with a user.[4]
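A web beacon of this kind is typically just a one-pixel image whose URL carries a unique identifier, so that fetching the image logs the visit on the tracker's server. The sketch below is illustrative only; the domain and query parameters are invented.

```python
import uuid
from urllib.parse import urlencode

def beacon_img_tag(campaign: str) -> str:
    """Build an HTML tag for a 1x1 tracking pixel carrying a unique identifier."""
    params = urlencode({"cid": campaign, "uid": uuid.uuid4().hex})
    # Loading this "image" tells tracker.example which user opened which page or email.
    return (f'<img src="https://tracker.example/pixel.gif?{params}" '
            f'width="1" height="1" alt="">')

print(beacon_img_tag("newsletter-2023-04"))
```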

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is called browser fingerprinting, and occurs when browsers, which can be customized to the user’s tastes, produce a unique signal that companies or advertisers can use to single out the user.[4] Browser fingerprinting has been a cause for concern because of its effectiveness and because it does not allow users to opt out of the tracking.[4]
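Fingerprinting works by combining many individually unremarkable browser attributes into one stable identifier. The sketch below hashes a set of made-up attribute values to illustrate the idea; real fingerprinting scripts gather far more signals (canvas rendering, fonts, audio stack, and so on).

```python
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Hash a set of browser attributes into a short, stable identifier."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attributes a fingerprinting script might read from a browser.
attrs = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080x24",
    "timezone": "America/New_York",
    "language": "en-US",
    "fonts": ["Arial", "DejaVu Sans", "Noto Sans"],
}
print(fingerprint(attrs))  # the same configuration yields the same value on every visit
```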

Another tactic used by Google is called AdID and works on smartphones in tandem with cookies on a user’s computer to track behavior across devices.[1]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers, through the use of audio beacons, or inaudible sound, emitted by one device and recognized through the microphone of the other device, usually a smartphone.[2] In addition, cross-device tracking may presage the future of the Internet of Things (IoT), in which all types of devices and environments––such as offices, cars, and homes––are seamlessly interconnected via the internet.[1]

Ultrasonic tracking

Humans interpret sound by picking up on different frequencies.[2] Of the variety of sound waves that exist, humans can only hear frequencies within a certain range, generally from 20 Hz to 20 kHz. By the age of 30, most humans cannot hear sounds above 18 kHz.[2]

Ultrasound, sound at frequencies of roughly 20 kHz and above (and thus with shorter wavelengths than audible sound), enables the rapid transmission of data necessary for cross-device tracking to occur.[2]
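For context, frequency and wavelength are inversely related; assuming sound travels at roughly 343 m/s in air, a 20 kHz signal has a wavelength of only a couple of centimetres:

\[
\lambda = \frac{v}{f} \approx \frac{343\ \text{m/s}}{20{,}000\ \text{Hz}} \approx 1.7\ \text{cm}
\]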

Another integral component of cross-device tracking is the usage of audio beacons. Audio beacons are identifiers embedded into ultrasonic audio, so they cannot be heard by humans.[2] These audio beacons are used to surreptitiously track a user’s location and monitor online behavior by connecting with the microphone on another device without the user’s awareness.[2]
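As a rough illustration of how data can ride on near-ultrasonic audio, the sketch below encodes a short bit string using frequency-shift keying just below 20 kHz. The carrier frequencies, bit rate, and format are assumptions made for the example, not any vendor's actual beacon protocol.

```python
import numpy as np

SAMPLE_RATE = 48_000                   # Hz; must exceed twice the carrier frequency
BIT_DURATION = 0.05                    # seconds per bit (20 bits/s, illustrative)
FREQ_0, FREQ_1 = 18_500.0, 19_500.0    # near-ultrasonic carriers for a 0 and a 1

def encode_beacon(bits: str) -> np.ndarray:
    """Encode a bit string as near-ultrasonic FSK audio samples."""
    t = np.arange(int(SAMPLE_RATE * BIT_DURATION)) / SAMPLE_RATE
    tones = {
        "0": np.sin(2 * np.pi * FREQ_0 * t),
        "1": np.sin(2 * np.pi * FREQ_1 * t),
    }
    return np.concatenate([tones[b] for b in bits])

signal = encode_beacon("10110010")     # e.g. an 8-bit beacon identifier
print(signal.shape)                    # (19200,) samples, largely inaudible to adults
```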

Applications

One study found 234 Android applications that eavesdrop on these ultrasonic channels without the user’s awareness.[2]

Applications such as Silverpush, Shopkick, and Lisnr are part of an “ultrasonic side-channel” in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user’s environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them.[2] A simplified sketch of how such a signal can be detected follows the list below.

  • Silverpush, the leading company utilizing this technology, has patented software that enables it to track television advertisements based on the audio beacons described above.[2]
  • Shopkick, another popular application, gives discounts to users who shop at stores which emit these ultrasonic beacons, allowing the company to create a profile of the user.[2]
  • Lisnr uses a user’s location data in tandem with ultrasonic beacons to give users coupons related to their activities.[2]
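A listening application can recover such a beacon by measuring the energy near each carrier frequency in successive microphone frames. The sketch below decodes the illustrative signal produced by the encoder above; it is a simplified assumption of how such detection works, not a documented SDK.

```python
import numpy as np

SAMPLE_RATE = 48_000
BIT_DURATION = 0.05
FREQ_0, FREQ_1 = 18_500.0, 19_500.0

def band_energy(frame: np.ndarray, freq: float, width: float = 200.0) -> float:
    """Sum the FFT magnitude within +/- width Hz of the given frequency."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1 / SAMPLE_RATE)
    return float(spectrum[np.abs(freqs - freq) < width].sum())

def decode_beacon(samples: np.ndarray) -> str:
    """Decode FSK bits by comparing energy at the two carrier frequencies."""
    frame_len = int(SAMPLE_RATE * BIT_DURATION)
    bits = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        bits.append("1" if band_energy(frame, FREQ_1) > band_energy(frame, FREQ_0) else "0")
    return "".join(bits)

# With the encoder sketched earlier: decode_beacon(encode_beacon("10110010")) == "10110010"
```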

Another study suggested that Apple, Google, and the Bluetooth Special Interest Group need to do more to prevent cross-device tracking.[6]

Privacy and surveillance concerns

Ultrasonic tracking

Ultrasonic tracking technologies can pose significant threats to users’ privacy. There are four primary privacy concerns associated with this new form of tracking:

  • The first is media tracking: audio from the user’s television may be detected by the microphone in the user’s mobile device, allowing malicious actors to gain access to what the user is watching––particularly if it is salacious.[2] Advertisers can similarly gain insight into what a user typically watches.[2] In both scenarios, a user’s real-world behavior is linked to their online identity and used for tracking.[2]
  • Another form of tracking permitted by ultrasonic tracking is cross-device tracking, which enables a user’s profile to be connected across multiple devices based on proximity.[2] This form of tracking, in linking different devices, can help advertisers show more targeted ads or open individuals to attacks by malicious actors.[2]
  • Location tracking is yet another privacy concern.[2] Indeed, ultrasonic signals can convey location information via a location identifier, often placed in stores or businesses.[2]
  • Lastly, this new ultrasonic tracking poses a threat to users of Bitcoin and Tor because it deanonymizes their information, since ultrasonic signals can associate the user’s mobile phone with their Bitcoin transactions or Tor browsing.[2]

Panoptic surveillance and the commodification of users' digital identity

From cookies to ultrasonic trackers, some argue that invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners were able to be seen at all times by guards but were unable to detect when, or even if, they were being watched at all, creating a sense of paranoia that drove prisoners to carefully police their own behavior.[7] Similarly, scholars have drawn parallels between Bentham’s panopticon and today’s pervasive use of internet tracking in that individuals lack awareness of the vast disparities of power that exist between themselves and the corporations to which they willingly give their data.[7] In essence, companies are able to gain access to consumers’ activity when they use a company’s services.[7] The usage of these services often is beneficial, which is why users agree to exchange personal information.[7] However, since users participate in this unequal environment, in which corporations hold most of the power and in which the user is obliged to accept the bad-faith offers made by the corporations, users are operating in an environment that ultimately controls, shapes, and molds them to think and behave in a certain way, depriving them of privacy.[7]

In direct response to the panoptic and invasive forms of tracking manifesting themselves within the digital realm, some have turned to sousveillance: a form of inverse surveillance in which users can record those who are surveilling them, thereby empowering themselves.[8] This form of counter-surveillance, often carried out with small wearable recording devices, enables the subversion of corporate and government panoptic surveillance by holding those in power accountable and giving people a voice––a permanent video record––to push back against government abuses of power or malicious behavior that may go unchecked.[8]

The television, along with the remote control, is also argued to be conditioning humans into habitually repeating that which they enjoy without experiencing genuine surprise or even discomfort, a critique of television similar to those made against information silos on social media sites today.[9] In essence, this technological development led to egocasting: a world in which people exert extreme amounts of control over what they watch and hear.[9] As a result, users deliberately avoid content they disagree with in any form––ideas, sounds, or images.[9] In turn, this siloing can drive political polarization and stoke tribalism.[9] In addition, companies like TiVo analyze how viewers use their remote and DVR capability to skip over programming, such as advertisements––a privacy concern users may also lack awareness of.[9]

Some scholars have even contended that in an age of increased surveillance, users now participate online through the active generation and curation of online images––a form of control.[10] In so doing, users can be seen as rejecting the shame associated with their private lives.[10] Other scholars note that surveillance is fundamentally dependent upon location in both physical and virtual environments.[11] This form of surveillance can be seen in travel websites which enable the user to share their vacation with a virtual audience.[11] The person’s willingness to share their personal information online is validated by the audience, since the audience holds the user accountable and the user vicariously experiences pleasure through the audience.[11] Further, users' mobile data is increasingly being shared with third parties online, potentially underscoring the regulatory challenges inherent in protecting users' online privacy.[12]

In addition, scholars argue that users have the right to know the value of their personal data.[13] Increasingly, users’ digital identity is becoming commodified through the selling and monetizing of their personal data for profit by large companies.[13] However, many people appear to be unaware that their data holds monetary value that can potentially be put towards other products and services.[13] Thus, scholars argue for users to have increased awareness of and transparency into this process so that they can become empowered and informed consumers of data.[13]

Surveillance capitalism

The increased usage of cross-device tracking by advertisers is indicative of the rise of a new era of data extraction and analysis as a form of profit, or surveillance capitalism, a term coined by Shoshana Zuboff.[14] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user.[14] Zuboff suggests that this new era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive as, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold.[14] Thus, since cross-device tracking seeks to create a profile of a user across multiple devices, big tech companies, such as Google, could use this behavioral data to make predictions about the user’s future behavior without the user’s awareness.[14]

Scholars are beginning to discuss the possibility of quantifying the monetary value of users’ personal data. Notably, the algorithms used to extract and mine user data are increasingly seen as business assets and thus protected via trade secrets.[13] Indeed, the usage of free online services, such as public Wi-Fi, often comes at the unknown cost to the user of being tracked and profiled by the company providing the service.[13] In essence, a transaction is occurring: users’ personal data is being exchanged for access to a free service.[13] Increasingly, scholars are advocating for users’ right to understand the fundamental value of their personal data more intimately so as to be more savvy, informed consumers who have the ability to protect the privacy of their online information and not be manipulated into unwittingly giving away personal information.[13]

Health and wellness applications

Health and wellness applications also lack adequate privacy protections: a study found that many health apps lacked encryption and argued that regulators should enforce stronger data privacy protections.[15] The study reported that of the 79 apps tested, none locally encrypted users’ personal information and 89% pushed the data online.[15] The lack of adequate privacy and security measures surrounding users’ personal medical data on mobile applications underscores the lessening degree to which users can trust mobile app developers to safeguard their personal information online.[15] While mobile application developers continue to confront privacy and security concerns, users are increasingly looking for ways to visualize their data through wearable devices and applications that track their workout and exercise routines.[16] Researchers have found that these self-tracking devices play the roles of a tool, a toy, and a tutor in users’ lives.[17] In the tool role, the self-tracking device functions as a mechanism to help the user in some capacity, often to achieve personal health goals.[17] The toy role underscores how some users treat self-tracking as a fun game, particularly with regard to rewards and viewing the visualized data.[17] Lastly, the tutor role reflects how users gain insight into and motivation for their activity from the apps themselves.[17] Other scholars have characterized self-tracking as performing for the system, or controlling what is (or is not) recorded; performing for the self, or tracking oneself to gain insight into one’s own behavior; and performing for other people, or attending to how others view the person being tracked and to the control that person has over their data and thus over how they are perceived.[18]

Cookies, flash cookies, and web beacons

Privacy concerns also surround the cookies, flash cookies, and web beacons used on websites today.[18] According to one study, five main concerns surround their usage:[5]

  • First, the authors note that users lack anonymity online, with cookies utilizing unique identifiers and flash cookies enabling recognition of website visits.[5]
  • Another concern the authors note is unintended uses of cookies: cookies were initially designed to benefit the user’s experience and engagement online, but have since morphed into a business run by advertisers in which personal data is sold for profit.[5]
  • Users are likely unaware of how their personal information is being used, reflecting the surreptitious nature of data collection.[5]
  • Some cookies trespass on web users’ own resources, being downloaded to the user’s computer often without the user’s awareness.[5]
  • Lastly, the authors note that the threat of cookie sharing underscores how web users’ personal information can be combined with other data from websites, and even a social security number, to create a more cohesive picture of the user.[5]

Data capitalism

Other scholars have defined a similarly extractive and destructive phenomenon called data capitalism.[19] Data capitalism is an economic system enabling the redistribution of power towards those who have access to the information––namely, big corporations.[19] There are three fundamental theories of how large companies engage users in virtual communities, reflecting the power of data capitalism on users today:

  • The free and open network: in making products free, large companies make their products more accessible to a larger audience from which they can extract valuable data in exchange.[19]
  • The connection between people and machines: data capitalism promotes a connection between people and machines which is derived from the user’s relationship to the technology itself.[19] As tracking and surveillance technology increasingly profiles users and learns their preferences, users become more comfortable with their devices, and a self-fulfilling prophecy continues.[19]
  • The value placed on data: new information asymmetries are proliferating that exacerbate inequality of information and allow only the most powerful access to most people’s data.[19] One scholar suggests that this lack of transparency over users’ data reflects the tension between privacy and community online.[19]

Solutions

Some scholars argue that the current notice-and-consent model for privacy policies is fundamentally flawed because it assumes users intuitively understand all of the facts in a privacy policy, which is often not the case.[20] Instead, scholars emphasize the imperative role of creating a culture in which privacy becomes a social norm.[20] In effect, users of online technologies should identify the social activities they engage in on the internet and start questioning websites' governing norms as a natural outgrowth of their web browsing.[20] These norms need to prevent websites from collecting and sharing users' personal information.[20] In addition, starting with a user's personal values and seeing how these values correlate with online norms may be another way to assess whether or not privacy norms are being violated in ambiguous cases.[20] Ultimately, scholars believe these privacy norms are vital to protecting both individuals and social institutions.[20]

Legal and ethical issues

While the United States lacks extensive privacy rights, the Fourth Amendment provides some privacy protections.[5] The Fourth Amendment states that “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated,” suggesting that while individuals are protected from all levels of the government, they are not legally protected from private companies or individuals with malicious intent.[5]

There are large implications for this technology within the legal field. Legally, the Federal Trade Commission has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury.[21] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance.[21] For instance, in the realm of invasive web tracking, the FTC has brought lawsuits against companies who engage in ‘history sniffing’––a technique that enables companies to ascertain which sites a user has visited based on the color of the links on a page.[21] Concerning tracking in physical space, the FTC has also cracked down on Nomi, a company that scans the MAC addresses of customers’ phones in stores.[21] MAC addresses function as a unique identifier, enabling the connection to wireless networks.[21] In the case of malware, the FTC has placed pressure on companies such as CyberSpy, which marketed keylogging software that could be disguised as an email attachment and secretly record users’ key presses.[21] The FTC has also cracked down on companies like Compete, a browser toolbar maker, because it collected and transmitted users’ personal information over the internet without adequate security, putting users at risk.[21] Lastly, in cases where deception is used to engage in surveillance, the FTC has investigated private investigators, who surveil individuals on another person’s behalf.[21] In addition, the audio beacon technology used by Silverpush could violate the FTC’s policies because users were not made aware of when the ultrasonic signals were being recorded.[21]
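For context, a MAC address is a stable hardware identifier, so even hashing it before storage (as retail analytics firms have reportedly done) still yields a value that links repeat visits by the same phone. The sketch below is illustrative only and does not describe Nomi's actual pipeline.

```python
import hashlib

def mac_to_identifier(mac: str) -> str:
    """Hash a device's MAC address into a persistent in-store identifier.

    Hashing hides the raw address, but the output stays the same for the
    same phone, so visits can still be linked over time.
    """
    normalized = mac.lower().replace("-", ":")
    return hashlib.sha256(normalized.encode()).hexdigest()[:16]

print(mac_to_identifier("A4:5E:60:12:34:56"))
```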

Another scholar believes that the convergence between lived experience and online technology is giving rise to what he terms Mixed Reality, in which people and things are replaced with virtual experiences.[22] Mixed Reality technologies can pose legal challenges in that laws which govern the online world will also extend to the real world.[22] In addition, data tagging––often through GPS, location-based services, or even near-field communication (NFC)––is the new technology at the heart of mixed reality, since people’s data is determined in part by their location.[22] Near-field communication enables devices to transmit data to each other within a certain range.[22] Virtual reality can become a privacy issue because it attempts to immerse users into the virtual environment by recording a user's every sensation.[22] In turn, mixed reality’s amalgamation with daily tasks suggests that it will be implicated in numerous legal issues ranging from copyright law to intellectual property law.[22] Customers are also being denied a voice in contracts, since only corporations set the rules by which individuals’ private information is mined and extracted.[22] The solution to these issues, according to scholars, is opt-in controls to police users’ privacy that enable balance to be restored to the law, particularly as it stands regarding contracts.[22]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences, as well as increased––and sometimes hidden––surveillance in everyday life, as violating users' rights to privacy.[14] The usage of surreptitious methods, in which the user is unaware of the extent to which he or she is being tracked, brings tracking mechanisms such as cookies, flash cookies, and web beacons into the ethical realm as well, since users are not being informed of this tracking perhaps as often as they should be.[5]

The future

The rise of instrumentarian power––the power of companies to control, modify, and predict users' behaviors––has been described as setting a new precedent for the future, in which human freedom is stifled and limited by big corporations.[23]

See also[edit]

Week 11: Cross-device tracking[edit]

Cross-device tracking refers to technology which enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.[1]

More specifically, cross-device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one.[2] For example, one such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.[2]

This form of tracking is utilized primarily by technology companies and advertisers who use this information to piece together a cohesive profile of the user.[2] These profiles inform and predict the type of advertisements the user receives.[2]

Background[edit]

There are many ways in which online tracking has manifested itself. Historically, when companies wanted to track users’ online behavior, they simply had users sign in to their website.[1] This is a form of deterministic cross-device tracking, in which the user’s devices are associated with their account credentials, such as their email or username.[3] Consequently, while the user is logged in, the company can keep a running history of what sites the user has been to and which ads the user interacted with between computers and mobile devices.[3]

Eventually, cookies were deployed by advertisers, providing each user with a unique identifier in his or her browser so that the user’s preferences can be monitored.[4] This unique identifier informs the placement of relevant, targeted ads the user may receive.[4] Cookies were also used by companies to improve the user experience, enabling users to pick up where they left off on websites.[5] However, as users began utilizing multiple devices––up to around five––advertisers became confused as to how to track, manage, and consolidate this data across multiple devices as the cookie-based model suggested that each device––whether a phone, computer, or tablet––was a different person.[4]

Other technologies such as supercookies, which stay on computers long after the user deletes his or her cookies, and web beacons, which are unique images from a URL, are also used by trackers and advertisers to gain increased insight into users’ behavior.[4] However, advertisers were still limited in that only one device was able to be tracked and associated with a user.[4]

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is called browser fingerprinting, and occurs when browsers, which are modifiable to the users’ tastes, produce a unique signal that companies or advertisers can use to single out the user.[4] Browser fingerprinting has been a cause for concern because of its effectiveness and also since it does not allow for users to opt-out of the tracking.[4]

Another tactic used by Google is called AdID and works on smartphones in tandem with cookies on a user’s computer to track behavior across devices.[1]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers through the use of audio beacons, or inaudible sound, emitted by one device and recognized through the microphone of the other device, usually a smartphone.[2] In addition, cross-device tracking may presage the future of the Internet of Things (IoT), in which all types of devices––such as offices, cars, and homes––are seamlessly interconnected via the internet.[1]

Ultrasonic tracking[edit]

Humans interpret sound by picking up on different frequencies.[2] Given the variety of sound waves that exist, humans can only hear frequencies that are within a certain range––generally from 20Hz to 20kHz. Interestingly, by the age of 30, most humans cannot hear sounds above 18kHz.[2]

Ultrasound, which emits shorter wavelengths greater than or equal to 20kHz, enables the rapid transmission of data necessary for cross-device tracking to occur.[2]

Another integral component of cross-device tracking is the usage of audio beacons. Audio beacons are beacons that are embedded into ultrasound, so they cannot be heard by humans.[2] These audio beacons are used to surreptitiously track a user’s location and monitor online behavior by connecting with the microphone on another device without the user’s awareness.[2]

Applications[edit]

Studies have shown that 234 Android applications are eavesdropping on these ultrasonic channels without the user’s awareness.[2]

Applications such as Silverpush, Shopkick, and Lisnr are part of an “ultrasonic side-channel” in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user’s environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them[2].

  • Silverpush­­––the leading company utilizing this technology––patented software enabling them to track TV ads based on audio stream above[2]
  • Shopkick, another popular application, gives discounts to users who shop at stores which emit these ultrasonic beacons, allowing them to create a profile of the user[2]
  • Lisnr utilizes a user’s location data in tandem with ultrasonic beacons to give users coupons related to their activities[2]

Another study suggested that Apple, Google, and Bluetooth Special Interest groups need to do more to prevent cross-device tracking.[6]

Privacy and surveillance concerns[edit]

Ultrasonic tracking technologies can pose massive threats to users’ privacy. There are four primary privacy concerns associated with this new form of tracking:

  • The first is media tracking: audio from the user’s television may be detected by the microphone in the user’s mobile device, allowing malicious actors to gain access to what the user is watching––particularly if it is salacious.[2] Advertisers can similarly gain insight into what a user typically watches.[2] In both scenarios, a user’s real-world behavior is linked to their online identity and used for tracking.[2]
  • Another form of tracking permitted by ultrasonic tracking is cross-device tracking, which enables a user’s profile to be connected across multiple devices based on proximity.[2] This form of tracking, in linking different devices, can help advertisers show more targeted ads or open individuals to attacks by malicious actors.[2]
  • Location tracking is yet another privacy concern.[2] Indeed, ultrasonic signals can convey location information via a location identifier, often placed in stores or businesses.[2]
  • Lastly, this new ultrasonic tracking poses a threat to users of Bitcoin and Tor because it deanonymizes users’ information, since ultrasonic signals associate the user’s mobile phone with the Bitcoin or Tor account.[2]

From cookies to ultrasonic trackers, some argue that invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners were able to be seen at all times by guards but were unable to detect when, or even if, they were being watched at all, creating a sense of paranoia that drove prisoners to carefully police their own behavior.[7] Similarly, scholars have drawn parallels between Bentham’s panopticon and today’s pervasive use of internet tracking in that individuals lack awareness to the vast disparities of power that exist between themselves and the corporation to which they willingly give their data.[7] In essence, companies are able to gain access to consumers’ activity when they use a company’s services.[7] The usage of these services often is beneficial, which is why users agree to exchange personal information.[7] However, since users participate in this unequal environment, in which corporations hold most of the power and in which the user is obliged to accept the bad faith offers made by the corporations, users are operating in an environment that ultimately controls, shapes and molds them to think and behave in a certain way, depriving them of privacy.[7]

In direct response to the panoptic and invasive forms of tracking manifesting themselves within the digital realm, some have turned to sousveillance: a form of inverse surveillance in which users can record those who are surveilling them, thereby empowering themselves.[8] This form of counter surveillance, often used through small wearable recording devices, enables the subversion of corporate and government panoptic surveillance by holding those in power accountable and giving people a voice––a permanent video record––to push back against government abuses of power or malicious behavior that may go unchecked.[8]

The television, along with the remote control, is also argued to be conditioning humans into habitually repeating that which they enjoy without experiencing genuine surprise or even discomfort, a critique of the television similar to that of those made against information silos on social media sites today.[9] In essence, this technological development led to egocasting: a world in which people exert extreme amounts of control over what they watch and hear.[9] As a result, users deliberately avoid content they disagree with in any form––ideas, sounds, or images.[9] In turn, this siloing can drive political polarization and stoke tribalism.[9] Plus, companies like TiVO analyze how TV show watchers use their remote and DVR capability to skip over programming, such as advertisements––a privacy concern users may lack awareness of as well.[9]

Some scholars have even contended that in an age of increased surveillance, users now participate online through the active generation and curation of online images––a form of control.[10] In so doing, users can be seen as rejecting the shame associated with their private lives.[10] Other scholars note that surveillance is fundamentally dependent upon location in both physical and virtual environments.[11] This form of surveillance can be seen in travel websites which enable the user to share their vacation to a virtual audience.[11] The person’s willingness to share their personal information online is validated by the audience, since the audience holds the user accountable and the user vicariously experiences pleasure through the audience.[11] Further, users' mobile data is increasingly being shared to third parties online, potentially underscoring the regulatory challenges inherent in protecting users' online privacy.[12]

In addition, scholars argue that users have the right to know the value of their personal data.[13] Increasingly, users’ digital identity is becoming commodified through the selling and monetizing of their personal data for profit by large companies.[13] Unfortunately, many people appear to be unaware of the fact that their data holds monetary value that can potentially be used towards other products and services.[13] Thus, scholars are arguing for users’ to have increased awareness and transparency into this process so that users can become empowered and informed consumers of data.[13]

Surveillance capitalism[edit]

The increased usage of cross-device tracking by advertisers is indicative of the rise of a new era of data extraction and analysis as a form of profit, or surveillance capitalism, a term coined by Shoshana Zuboff.[14] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user.[14] Zuboff suggests that this new era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive as, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold.[14] Thus, since cross-device tracking seeks to create a profile of a user across multiple devices, big tech companies, such as Google, could use this behavioral data to make predictions about the user’s future behavior without the user’s awareness.[14]

Scholars are beginning to discuss the possibility of quantifying the monetary value of users’ personal data. Notably, the algorithms used to extract and mine user data are increasingly seen as business assets and thus protected via trade secrets.[13] Indeed, the usage of free online services, such as public Wi-Fi, often comes at the unknown cost to the user of being tracked and profiled by the company providing the service.[13] In essence, a transaction is occurring: users’ personal data is being exchanged for access to a free service.[13] Increasingly, scholars are advocating for users’ right to understand the fundamental value of their personal data more intimately so as to be more savvy, informed consumers who have the ability to protect the privacy of their online information and not be manipulated into unwittingly giving away personal information.[13]

In addition, health and wellness applications also have a dearth of privacy protections as well: a study found that many health apps lacked encryption and that regulators should enforce stronger data privacy protections.[15] The study stated that of the 79 apps they tested, none of the applications locally encrypted the users’ personal information and 89% of the applications pushed the data online.[15] The lack of adequate privacy and security measures surrounding users’ personal medical data on mobile applications underscores the lessening degree to which users can trust mobile app developers to safeguard their personal information online.[15] While mobile application developers continue to confront privacy and security concerns, users are increasingly looking to ways to visualize their data through wearable devices and applications that track their workout and exercise routines.[16] Indeed, researchers discovered that these self-tracking devices play a role as a tool, a toy, and a tutor in users’ lives.[17] In the tool role, the self-tracking device functions as a mechanism to help the user in some capacity, often to achieve personal health goals.[17] The toy role underscores how some self-tracking users see it as a fun game, particularly with regard to rewards and viewing the visualized data.[17] Lastly, the tutor role reflects how users gain insights from and motivation about their activity from the apps themselves.[17] Other scholars have characterized self-tracking as performing for the system, or controlling what is (or isn’t) recorded, performing for the self, tracking themselves to gain insight into their behavior, and performing for other people, or the importance of how other people viewed the person being tracked, as well as the control the person being tracked had over their data and thus how they are perceived.[18]

Additionally, privacy concerns surround cookies, flash cookies, and web beacons on websites today.[18] Ultimately, five main concerns surround the usage of cookies, flash cookies, and web beacons, according to a study:[5]

  • Firstly, the authors note that users lack anonymity online, with cookies utilizing unique identifiers and flash cookies enabling recognition of website visits[5]
  • Another concern the authors note is unintended uses of cookies, since cookies were initially designed to benefit the user’s experience and engagement online, but have since morphed into a business run by advertisers in which personal data is sold for profit[5]
  • Users are likely unaware of how their personal information is being used, reflecting the surreptitious nature of data collection[5]
  • Some cookies trespass into the web users’ own resources and are downloaded to the user’s computer often without the user’s awareness[5]
  • Lastly, the authors note that the threat of cookie sharing underscores how web users’ personal information can become combined with other data from websites and even a social security number to create a more cohesive picture of the user[5]

Data capitalism[edit]

Other scholars have defined a similarly extractive and destructive phenomenon called data capitalism.[19] Data capitalism is an economic system enabling the redistribution of power towards those who have access to the information––namely, big corporations.[19] There are three fundamental theories of how large companies engage users in virtual communities, reflecting the power of data capitalism on users today:

  • The free and open network: in making products free, large companies make their products more accessible to a larger audience from which they can extract valuable data in exchange.[19]
  • The connection between people and machines: data capitalism promotes a connection between people and machines which is derived from the user’s relationship to the technology itself.[19] Increasingly, tracking and surveillance technology is profiling users and learning their preferences, users become more comfortable with their devices and a self-fulfilling prophecy continues.[19]
  • The value placed on data: new information asymmetries are proliferating that exacerbate inequality of information and allow only the most powerful access to most people’s data.[19] Increasingly, a scholar suggests that the lack of transparency over users’ data reflects the tension between privacy and community online.[19]

Solutions[edit]

Scholars are convinced the current notice-and-consent model for privacy policies is fundamentally flawed because it assumes users intuitively understand all of the facts in a privacy policy, which is often not the case.[20] Instead, scholars emphasize the imperative role of creating a culture in which privacy becomes a social norm.[20] In effect, users of online technologies should identify the social activities they use on the internet and start questioning websites' governing norms as a natural outgrowth of their web browsing.[20] In effect, these norms need to prevent websites from collecting and sharing users' personal information.[20] In addition, starting with a user's personal values and seeing how these values correlate with online norms may be another way to assess whether or not privacy norms are being violated in odd cases.[20] Ultimately, scholars believe these privacy norms are vital to protecting both individuals and social institutions.[20]

Legal and ethical issues[edit]

While the United States lacks extensive privacy rights, the Fourth Amendment provides some privacy protections.[5] The Fourth Amendment states that “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated,” suggesting that while individuals are protected from all levels of the government, they are not legally protected from private companies or individuals with malicious intent.[5]

There are large implications for this technology within the legal field. Legally, The Federal Trade Commission has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury.[21] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance.[21] For instance, in the realm of invasive web tracking, the FTC has brought lawsuits against companies who engage in ‘history sniffing’––a technique that enables companies to ascertain which links a user clicked on based on the color of the link.[21] Concerning tracking in physical space, the FTC has also cracked down on Nomi, a company that scans the MAC addresses of customers’ phones in stores.[21] MAC addresses function as a unique identifier, enabling the connection to wireless networks.[21] In the case of malware, the FTC has placed pressure on companies such as CyberSpy, a self-proclaimed email attachment company that claimed to secretly record users’ key presses.[21] The FTC has also cracked down on companies like Compete, a browser toolbar, because it decrypted users’ personal information on the internet, putting users at risk.[21] Lastly, in cases during which deception is used to engage in surveillance, the FTC has investigated private investigators, who surveil individuals on another person’s behalf.[21] In addition, audio beacon technology, used by an application called Silverpush, could violate the FTC’s policies because users were not made aware as to when the ultrasonic signals were being recorded.[21]

Another scholar believes that the convergence between lived experience and online technology is creating a term called Mixed Reality, in which people and things are replaced with virtual experiences.[22] Mixed Reality technologies can pose legal challenges in that laws which govern the online world will also extend to the real world.[22] In addition, data tagging––often through GPS, location-based services, or even near-field communication (NFC)––is the new technology at the heart of mixed reality, since people’s data is determined in part by their location.[22] Near-field communication enables devices to transmit data to each other with a certain range.[22] Virtual reality can become a privacy issue because it attempts to immerse users into the virtual environment by recording a user's every sensation.[22] In turn, mixed reality’s amalgamation with daily tasks suggest that it will be implicated in numerous legal issues ranging from copyright law to intellectual property law.[22] Customers are also being denied a voice in contracts, since only corporations set the rules by which individuals’ private information is mined and extracted.[22] The solution to these issues, according to scholars, are opt-in controls to police users’ privacy that enable balance to be restored to the law, particularly as it stands regarding contracts.[22]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences as well as increased surveillance––which is sometimes hidden––in everyday life as violating users' rights to privacy.[14] The usage of surreptitious methods, in which the user is unaware of the extent to which he or she is being tracked, brings tracking mechanisms––such as cookies, flash cookies, and web beacons––into the ethical realm as well since users are not being informed of this tracking perhaps as often as they should.[5]

The future[edit]

The rise of instrumentarian power––the power of companies to control, modify, and predict users' behaviors––as reflecting a new precedent for the future, as human freedom is stifled and limited by big corporations.[23]

See also[edit]


Week 10: Cross-device tracking[edit]

Cross-device tracking refers to technology which enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.[1]

More specifically, cross-device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one.[2] For example, one such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.[2]

This form of tracking is utilized primarily by technology companies and advertisers, who use this information to piece together a cohesive profile of the user, which informs and predicts the type of advertisements the user receives.[2]

Background[edit]

There are many ways in which online tracking has manifested itself. Historically, when companies wanted to track users’ online behavior, they simply had users sign in to their website.[1] This is a form of deterministic cross-device tracking, in which the user’s devices are associated with their account credentials, such as their email or username.[3] Consequently, while the user is logged in, the company can keep a running history of what sites the user has been to and which ads the user interacted with between computers and mobile devices.[3]

Eventually, cookies were deployed by advertisers, providing each user with a unique identifier in his or her browser so that the user’s preferences can be monitored.[4] This unique identifier informs the placement of relevant, targeted ads the user may receive.[4] Cookies were also used by companies to improve the user experience, enabling users to pick up where they left off on websites.[5] However, as users began utilizing multiple devices––up to around five––advertisers became confused as to how to track, manage, and consolidate this data across multiple devices as the cookie-based model suggested that each device––whether a phone, computer, or tablet––was a different person.[4]

Other technologies such as supercookies, which stay on computers long after the user deletes his or her cookies, and web beacons, which are unique images from a URL, are also used by trackers and advertisers to gain increased insight into users’ behavior.[4] However, advertisers were still limited in that only one device was able to be tracked and associated with a user.[4]

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is called browser fingerprinting, and occurs when browsers, which are modifiable to the users’ tastes, produce a unique signal that companies or advertisers can use to single out the user.[4] Browser fingerprinting has been a cause for concern because of its effectiveness and also since it does not allow for users to opt-out of the tracking.[4]

Another tactic used by Google is called AdID and works on smartphones in tandem with cookies on a user’s computer to track behavior across devices.[1]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers through the use of audio beacons, or inaudible sound, emitted by one device and recognized through the microphone of the other device, usually a smartphone.[2] In addition, cross-device tracking may presage the future of the Internet of Things (IoT), in which all types of devices––such as offices, cars, and homes––are seamlessly interconnected via the internet.[1]

Ultrasonic tracking[edit]

Humans interpret sound by picking up on different frequencies.[2] Given the variety of sound waves that exist, humans can only hear frequencies that are within a certain range––generally from 20Hz to 20kHz. Interestingly, by the age of 30, most humans cannot hear sounds above 18kHz.[2]

Ultrasound, which emits shorter wavelengths greater than or equal to 20kHz, enables the rapid transmission of data necessary for cross-device tracking to occur.[2]

Another integral component of cross-device tracking is the usage of audio beacons. Audio beacons are beacons that are embedded into ultrasound, so they cannot be heard by humans.[2] These audio beacons are used to surreptitiously track a user’s location and monitor online behavior by connecting with the microphone on another device without the user’s awareness.[2]

Applications[edit]

Studies have shown that 234 Android applications are eavesdropping on these ultrasonic channels without the user’s awareness.[2]

Applications such as Silverpush, Shopkick, and Lisnr are part of an “ultrasonic side-channel” in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user’s environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them[2].

  • Silverpush­­––the leading company utilizing this technology––patented software enabling them to track TV ads based on audio stream above[2]
  • Shopkick, another popular application, gives discounts to users who shop at stores which emit these ultrasonic beacons, allowing them to create a profile of the user[2]
  • Lisnr utilizes a user’s location data in tandem with ultrasonic beacons to give users coupons related to their activities[2]

Another study suggested that Apple, Google, and Bluetooth Special Interest groups need to do more to prevent cross-device tracking.[6]

Privacy and surveillance concerns with cross-device tracking[edit]

Ultrasonic tracking technologies can pose massive threats to users’ privacy. There are four primary privacy concerns associated with this new form of tracking:

  • The first is media tracking: audio from the user’s television may be detected by the microphone in the user’s mobile device, allowing malicious actors to gain access to what the user is watching––particularly if it is salacious.[2] Advertisers can similarly gain insight into what a user typically watches.[2] In both scenarios, a user’s real-world behavior is linked to their online identity and used for tracking.[2]
  • Another form of tracking permitted by ultrasonic tracking is cross-device tracking, which enables a user’s profile to be connected across multiple devices based on proximity.[2] This form of tracking, in linking different devices, can help advertisers show more targeted ads or open individuals to attacks by malicious actors.[2]
  • Location tracking is yet another privacy concern.[2] Indeed, ultrasonic signals can convey location information via a location identifier, often placed in stores or businesses.[2]
  • Lastly, this new ultrasonic tracking poses a threat to users of Bitcoin and Tor because it deanonymizes users’ information, since ultrasonic signals associate the user’s mobile phone with the Bitcoin or Tor account.[2]

From cookies to ultrasonic trackers, some argue that invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners were able to be seen at all times by guards but were unable to detect when, or even if, they were being watched at all, creating a sense of paranoia that drove prisoners to carefully police their own behavior.[7] Similarly, scholars have drawn parallels between Bentham’s panopticon and today’s pervasive use of internet tracking in that individuals lack awareness to the vast disparities of power that exist between themselves and the corporation to which they willingly give their data.[7] In essence, companies are able to gain access to consumers’ activity when they use a company’s services.[7] The usage of these services often is beneficial, which is why users agree to exchange personal information.[7] However, since users participate in this unequal environment, in which corporations hold most of the power and in which the user is obliged to accept the bad faith offers made by the corporations, users are operating in an environment that ultimately controls, shapes and molds them to think and behave in a certain way, depriving them of privacy.[7]

In direct response to the panoptic and invasive forms of tracking manifesting themselves within the digital realm, some have turned to sousveillance: a form of inverse surveillance in which users can record those who are surveilling them, thereby empowering themselves.[8] This form of counter surveillance, often used through small wearable recording devices, enables the subversion of corporate and government panoptic surveillance by holding those in power accountable and giving people a voice––a permanent video record––to push back against government abuses of power or malicious behavior that may go unchecked.[8]

The television, along with the remote control, is also argued to be conditioning humans into habitually repeating activities they enjoy without experiencing genuine surprise or even discomfort, a critique of the television similar to that of those made against information silos on social media sites today.[9] In essence, this technological development led to egocasting: a world in which people exert extreme amounts of control over what they watch and hear.[9] As a result, users deliberately avoid content they disagree with in any form––ideas, sounds, or images.[9] In turn, this siloing can drive political polarization and stoke tribalism.[9] Plus, companies like TiVO analyze how TV show watchers use their remote and DVR capability to skip over programming, such as advertisements––a privacy concern users may lack awareness of as well.[9]

Some scholars have even contended that in an age of increased surveillance, users now participate online through the active generation and curation of online images––a form of control.[10] In so doing, users can be seen as rejecting the shame associated with their private lives.[10] Other scholars note that surveillance is fundamentally dependent upon location in both physical and virtual environments.[11] This form of surveillance can be seen in travel websites which enable the user to share their vacation to a virtual audience.[11] The person’s willingness to share their personal information online is validated by the audience, since the audience holds the user accountable and the user vicariously experiences pleasure through the audience.[11] Further, users' mobile data is increasingly being shared to third parties online, potentially underscoring the regulatory challenges inherent in protecting users' online privacy.[12]

In addition, scholars argue that users have the right to know the value of their personal data.[13] Increasingly, users’ digital identity is becoming commodified through the selling and monetizing of their personal data for profit by large companies.[13] Unfortunately, many people appear to be unaware of the fact that their data holds monetary value that can potentially be used towards other products and services.[13] Thus, scholars are arguing for users’ to have increased awareness and transparency into this process so that users can become empowered and informed consumers of data.[13]

Surveillance capitalism[edit]

The increased usage of cross-device tracking by advertisers is indicative of the rise of a new era of data extraction and analysis as a form of profit, or surveillance capitalism, a term coined by Shoshana Zuboff.[14] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user.[14] Zuboff suggests that this new era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive as, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold.[14] Thus, since cross-device tracking seeks to create a profile of a user across multiple devices, big tech companies, such as Google, could use this behavioral data to make predictions about the user’s future behavior without the user’s awareness.[14]

Scholars are beginning to discuss the possibility of quantifying the monetary value of users’ personal data. Notably, the algorithms used to extract and mine user data are increasingly treated as business assets and thus protected as trade secrets.[13] Indeed, the use of free online services, such as public Wi-Fi, often comes at the unknown cost to the user of being tracked and profiled by the company providing the service.[13] In essence, a transaction is occurring: users’ personal data is exchanged for access to a free service.[13] Increasingly, scholars advocate for users’ right to understand the value of their personal data so they can be savvier, more informed consumers who are able to protect the privacy of their online information and are not manipulated into unwittingly giving it away.[13]

In addition, health and wellness applications suffer from a dearth of privacy protections: one study found that many health apps lacked encryption and argued that regulators should enforce stronger data privacy protections.[15] Of the 79 apps tested, none locally encrypted users’ personal information, and 89% of the applications pushed the data online.[15] The lack of adequate privacy and security measures surrounding users’ personal medical data on mobile applications underscores the lessening degree to which users can trust mobile app developers to safeguard their personal information online.[15] While mobile application developers continue to confront privacy and security concerns, users are increasingly looking for ways to visualize their data through wearable devices and applications that track their workout and exercise routines.[16] Indeed, researchers discovered that these self-tracking devices play the roles of a tool, a toy, and a tutor in users’ lives.[17] In the tool role, the self-tracking device functions as a mechanism to help the user in some capacity, often to achieve personal health goals.[17] The toy role underscores how some users experience self-tracking as a fun game, particularly with regard to rewards and viewing the visualized data.[17] Lastly, the tutor role reflects how users gain insight into and motivation about their activity from the apps themselves.[17] Other scholars have characterized self-tracking as performing for the system (controlling what is or is not recorded), performing for the self (tracking oneself to gain insight into one’s own behavior), and performing for other people (managing how others view the person being tracked, as well as how much control that person has over their data and thus over how they are perceived).[18]

Additionally, privacy concerns surround cookies, flash cookies, and web beacons on websites today.[18] According to one study, five main concerns surround their usage:[5]

  • Firstly, the authors note that users lack anonymity online, with cookies utilizing unique identifiers and flash cookies enabling recognition of website visits[5]
  • Another concern the authors note is unintended uses of cookies, since cookies were initially designed to benefit the user’s experience and engagement online, but have since morphed into a business run by advertisers in which personal data is sold for profit[5]
  • Users are likely unaware of how their personal information is being used, reflecting the surreptitious nature of data collection[5]
  • Some cookies trespass into the web users’ own resources and are downloaded to the user’s computer often without the user’s awareness[5]
  • Lastly, the authors note that the threat of cookie sharing underscores how web users’ personal information can become combined with other data from websites and even a social security number to create a more cohesive picture of the user[5]
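To illustrate the first of these concerns, the following sketch (an illustrative example rather than code from the cited study; the cookie name "uid" and the one-year lifetime are hypothetical choices) shows how a site script might assign a persistent unique identifier to a browser so that later visits and ad requests can be tied back to the same profile:

    // Illustrative sketch only: assign a random, persistent identifier to the
    // browser so that later visits can be linked to the same profile.
    // The cookie name ("uid") and one-year lifetime are hypothetical choices.
    function getOrCreateTrackingId(): string {
      const match = document.cookie.match(/(?:^|; )uid=([^;]*)/);
      if (match) {
        return decodeURIComponent(match[1]); // returning visitor: reuse the identifier
      }
      const id = crypto.randomUUID();        // fresh identifier for a new visitor
      const oneYear = 60 * 60 * 24 * 365;    // cookie lifetime in seconds
      document.cookie = `uid=${encodeURIComponent(id)}; max-age=${oneYear}; path=/`;
      return id;
    }

    // The identifier can then accompany requests for ads or analytics, allowing a
    // visitor's browsing to be tied to a single profile over time.
    console.log("visitor id:", getOrCreateTrackingId());

Third-party advertisers have historically set similar identifiers from their own domains, which is what allows a user's activity to be recognized across the many sites that embed their content.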

Data capitalism[edit]

Other scholars have defined a similarly extractive and destructive phenomenon called data capitalism.[19] Data capitalism is an economic system enabling the redistribution of power towards those who have access to the information––namely, big corporations.[19] There are three fundamental theories of how large companies engage users in virtual communities, reflecting the power that data capitalism exerts over users today:

  • The free and open network: in making products free, large companies make their products more accessible to a larger audience from which they can extract valuable data in exchange.[19]
  • The connection between people and machines: data capitalism promotes a connection between people and machines that is derived from the user’s relationship to the technology itself.[19] As tracking and surveillance technology increasingly profiles users and learns their preferences, users become more comfortable with their devices, and a self-fulfilling prophecy continues.[19]
  • The value placed on data: new information asymmetries are proliferating, exacerbating inequality of information and allowing only the most powerful access to most people’s data.[19] One scholar suggests that this lack of transparency over users’ data reflects the tension between privacy and community online.[19]

Solutions[edit]

Some scholars argue that the current notice-and-consent model for privacy policies is fundamentally flawed because it assumes users intuitively understand all of the facts in a privacy policy, which is often not the case.[20] Instead, they emphasize the imperative of creating a culture in which privacy becomes a social norm.[20] In effect, users of online technologies should identify the social activities they engage in on the internet and begin questioning websites' governing norms as a natural outgrowth of their web browsing.[20] These norms need to prevent websites from collecting and sharing users' personal information.[20] In addition, starting with a user's personal values and seeing how those values correlate with online norms may be another way to assess whether privacy norms are being violated in unusual cases.[20] Ultimately, these scholars believe such privacy norms are vital to protecting both individuals and social institutions.[20]

Legal and ethical issues[edit]

While the United States lacks extensive privacy rights, the Fourth Amendment provides some privacy protections.[5] The Fourth Amendment states that “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated,” suggesting that while individuals are protected from all levels of the government, they are not legally protected from private companies or individuals with malicious intent.[5]

There are large implications for this technology within the legal field. Legally, the Federal Trade Commission (FTC) has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury.[21] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance.[21] For instance, in the realm of invasive web tracking, the FTC has brought lawsuits against companies that engage in ‘history sniffing’––a technique that enables companies to ascertain which links a user has clicked on based on the color of the link.[21] Concerning tracking in physical space, the FTC has also cracked down on Nomi, a company that scans the MAC addresses of customers’ phones in stores.[21] MAC addresses function as unique identifiers that enable devices to connect to wireless networks.[21] In the case of malware, the FTC has placed pressure on companies such as CyberSpy, which sold software, delivered as an email attachment, that secretly recorded users’ key presses.[21] The FTC has also cracked down on companies such as Compete, a browser toolbar maker, because it transmitted users’ personal information over the internet without encryption, putting users at risk.[21] Lastly, in cases where deception is used to engage in surveillance, the FTC has investigated private investigators, who surveil individuals on another person’s behalf.[21] In addition, audio beacon technology, used by an application called Silverpush, could violate the FTC’s policies because users were not made aware of when the ultrasonic signals were being recorded.[21]
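For context on the history-sniffing practice mentioned above, the sketch below outlines how the classic technique worked: a page styled visited links with a distinctive color and then read the computed style of probe links to infer which URLs were in the visitor's history. The probed URLs here are placeholders and the example is illustrative only; modern browsers now report the unvisited style for such checks, which largely defeats the technique.

    // Illustrative sketch of classic history sniffing. Modern browsers deliberately
    // report the unvisited style for scripted checks, so this no longer works reliably.
    const probeStyle = document.createElement("style");
    probeStyle.textContent = "a:visited { color: rgb(255, 0, 0); }"; // distinctive color for visited links
    document.head.appendChild(probeStyle);

    // Placeholder list of URLs a tracker might probe.
    const candidates = ["https://example.com/", "https://example.org/news"];

    for (const url of candidates) {
      const link = document.createElement("a");
      link.href = url;
      document.body.appendChild(link);
      // If the computed color matches the :visited rule, the URL was inferred
      // to be in the visitor's browsing history.
      const visited = getComputedStyle(link).color === "rgb(255, 0, 0)";
      console.log(url, visited ? "appears visited" : "appears unvisited");
      link.remove();
    }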

Another scholar believes that the convergence between real, lived experience and online technology is creating a condition termed mixed reality, in which people and things are replaced with virtual experiences.[22] Mixed reality technologies can pose legal challenges in that laws which govern the online world will also extend to the real world.[22] In addition, data tagging––often through GPS, location-based services, or even near-field communication (NFC)––is the new technology at the heart of mixed reality, since people’s data is determined in part by their location.[22] Virtual reality can become a privacy issue because it attempts to immerse users in the virtual environment by recording a user's every sensation.[22] In turn, mixed reality’s amalgamation with daily tasks suggests that it will be implicated in numerous legal issues, ranging from copyright law to intellectual property law.[22] Customers are also denied a voice in contracts, since only corporations set the rules by which individuals’ private information is mined and extracted.[22] The solution to these issues, according to scholars, is opt-in controls that police users’ privacy and restore balance to the law, particularly as it stands regarding contracts.[22]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences, as well as increased surveillance––which is sometimes hidden––in everyday life, as violating users' right to privacy.[14] The use of surreptitious methods, in which the user is unaware of the extent to which he or she is being tracked, also brings tracking mechanisms––such as cookies, flash cookies, and web beacons––into the ethical realm, since users are not informed of this tracking as often as they perhaps should be.[5]

The future[edit]

Zuboff sees the rise of instrumentarian power––the power of companies to control, modify, and predict users' behaviors––as reflecting a new precedent for the future, as human freedom is stifled and limited by big corporations.[23]

See also[edit]


Week 9: Cross-device tracking[edit]

Cross-device tracking refers to technology which enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.[1]

More specifically, cross device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one.[2]. For example, one such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.[2]

This form of tracking is utilized primarily by technology companies and advertisers, who use this information to piece together a cohesive profile of the user, which informs and predicts the type of advertisements the user receives.[2]

Background[edit]

There are many ways in which online tracking has manifested itself. Historically, when companies wanted to track users’ online behavior, they simply had users sign in to their website.[1] This is a form of deterministic cross-device tracking, in which the user’s devices are associated with their account credentials, such as their email or username.[3] Consequently, while the user is logged in, the company can keep a running history of what sites the user has been to and which ads the user interacted with between computers and mobile devices.[3]

Eventually, cookies were deployed by advertisers, providing each user with a unique identifier in his or her browser so that the user’s preferences can be monitored.[4] This unique identifier informs the placement of relevant, targeted ads the user may receive.[4] Cookies were also used by companies to improve the user experience, enabling users to pick up where they left off on websites.[5] However, as users began utilizing multiple devices––up to around five––advertisers became confused as to how to track, manage, and consolidate this data across multiple devices as the cookie-based model suggested that each device––whether a phone, computer, or tablet––was a different person.[4]

Other technologies such as supercookies, which stay on computers long after the user deletes his or her cookies, and web beacons, which are unique images from a URL, are also used by trackers and advertisers to gain increased visibility into users’ behavior.[4] However, advertisers were still limited in that only one device was able to be tracked and associated with a user.[4]

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is called browser fingerprinting, and occurs when browsers, which are modifiable to the users’ tastes, produce a unique signal that companies or advertisers can use to single out the user.[4] Browser fingerprinting has been a cause for concern because of its effectiveness and also since it does not allow for users to opt-out of the tracking.[4]

Another tactic used by Google is called “AdID” and works on smartphones in tandem with cookies on a user’s computer to track behavior across devices.[1]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers through the use of audio beacons, or inaudible sound, emitted by one device and recognized through the microphone of the other device, usually a smartphone.[2] In addition, cross-device tracking may presage the future of the Internet of Things (IoT), in which all types of devices––such as offices, cars, and homes––are seamlessly interconnected via the internet.[1]

Ultrasonic tracking[edit]

Humans interpret sound by picking up on different frequencies.[2] Given the variety of sound waves that exist, humans can only hear frequencies within a certain range––generally from 20 Hz to 20 kHz. By the age of 30, most humans cannot hear sounds above 18 kHz.[2]

Ultrasound (sound at frequencies of 20 kHz or above, and therefore with shorter wavelengths than audible sound) enables the rapid transmission of the data necessary for cross-device tracking to occur.[2]

Another integral component of cross-device tracking is the use of audio beacons: signals embedded in ultrasound, so that they cannot be heard by humans.[2] These audio beacons are used to surreptitiously track a user’s location and monitor online behavior by connecting with the microphone of another device without the user’s awareness.[2]
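As a rough sketch of how an application with microphone access might listen for such a beacon (the 18 kHz to 20 kHz band and the detection threshold below are illustrative assumptions, not parameters from the cited study), a script could watch for unusual energy in the near-ultrasonic range:

    // Illustrative sketch only: monitor the microphone for energy in an assumed
    // near-ultrasonic band (18-20 kHz) where an audio beacon might be placed.
    async function listenForBeacon(): Promise<void> {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const ctx = new AudioContext();
      const analyser = ctx.createAnalyser();
      analyser.fftSize = 2048;
      ctx.createMediaStreamSource(stream).connect(analyser);

      const bins = new Float32Array(analyser.frequencyBinCount);
      const binWidth = ctx.sampleRate / analyser.fftSize; // frequency width of each FFT bin, in Hz

      setInterval(() => {
        analyser.getFloatFrequencyData(bins); // per-bin levels in decibels
        const lo = Math.floor(18000 / binWidth);
        const hi = Math.min(Math.ceil(20000 / binWidth), bins.length - 1);
        const peak = Math.max(...Array.from(bins.slice(lo, hi + 1)));
        // -60 dB is an arbitrary illustrative threshold for "a tone seems present".
        if (peak > -60) {
          console.log("possible near-ultrasonic beacon, peak level (dB):", peak.toFixed(1));
        }
      }, 250);
    }

    listenForBeacon();

A transmitting device, such as a television playing an advertisement with an embedded beacon, would presumably do the inverse, modulating an identifier onto tones in that band at a level low enough to go unnoticed.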

Applications[edit]

One study found that 234 Android applications were eavesdropping on these ultrasonic channels without the user’s awareness.[2]

Applications such as Silverpush, Shopkick, and Lisnr are part of an “ultrasonic side-channel” in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user’s environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them.[2]

  • Silverpush––the leading company utilizing this technology––patented software enabling it to track TV ads based on the audio stream described above[2]
  • Shopkick, another popular application, gives discounts to users who shop at stores which emit these ultrasonic beacons, allowing them to create a profile of the user[2]
  • Lisnr utilizes a user’s location data in tandem with ultrasonic beacons to give users coupons related to their activities[2]

Another study suggested that Apple, Google, and the Bluetooth Special Interest Group need to do more to prevent cross-device tracking.[6]

Privacy and surveillance concerns with cross-device tracking[edit]

Ultrasonic tracking technologies can pose significant threats to users’ privacy. There are four primary privacy concerns associated with this new form of tracking:

  • The first is media tracking: audio from the user’s television may be detected by the microphone in the user’s mobile device, allowing malicious actors to gain access to what the user is watching––particularly if it is salacious.[2] Advertisers can similarly gain insight into what a user typically watches.[2] In both scenarios, a user’s real-world behavior is linked to their online identity and used for tracking.[2]
  • Another form of tracking permitted by ultrasonic tracking is cross-device tracking, which enables a user’s profile to be connected across multiple devices based on proximity.[2] This form of tracking, in linking different devices, can help advertisers show more targeted ads or open individuals to attacks by malicious actors.[2]
  • Location tracking is yet another privacy concern.[2] Indeed, ultrasonic signals can convey location information via a location identifier, often placed in stores or businesses.[2]
  • Lastly, this new ultrasonic tracking poses a threat to users of Bitcoin and Tor because it deanonymizes users’ information, since ultrasonic signals associate the user’s mobile phone with the Bitcoin or Tor account.[2]

From cookies to ultrasonic trackers, some argue that invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners were able to be seen at all times by guards but were unable to detect when, or even if, they were being watched at all, creating a sense of paranoia that drove prisoners to carefully police their own behavior.[7] Similarly, scholars have drawn parallels between Bentham’s panopticon and today’s pervasive use of internet tracking in that individuals lack awareness to the vast disparities of power that exist between themselves and the corporation to which they willingly give their data.[7] In essence, companies are able to gain access to consumers’ activity when they use a company’s services.[7] The usage of these services often is beneficial, which is why users agree to exchange personal information.[7] However, since users participate in this unequal environment, in which corporations hold most of the power and in which the user is obliged to accept the bad faith offers made by the corporations, users are operating in an environment that ultimately controls, shapes and molds them to think and behave in a certain way, depriving them of privacy.[7]

In direct response to the panoptic and invasive forms of tracking manifesting themselves within the digital realm, some have turned to sousveillance: a form of inverse surveillance in which users can record those who are surveilling them, thereby empowering themselves.[8] This form of counter surveillance, often used through small wearable recording devices, enables the subversion of corporate and government panoptic surveillance by holding those in power accountable and giving people a voice––a permanent video record––to push back against government abuses of power or malicious behavior that may go unchecked.[8]

Some scholars have even contended that in an age of increased surveillance, users now participate online through the active generation and curation of online images––a form of control.[10] In so doing, users can be seen as rejecting the shame associated with their private lives.[10] Other scholars note that surveillance is fundamentally dependent upon location in both physical and virtual environments.[11] This form of surveillance can be seen in travel websites which enable the user to share their vacation to a virtual audience.[11] The person’s willingness to share their personal information online is validated by the audience, since the audience holds the user accountable and the user vicariously experiences pleasure through the audience.[11] Further, researchers believe that users' mobile data is increasingly being shared to third parties online, potentially underscoring the regulatory challenges inherent in protecting users' online privacy.[12]

In addition, scholars argue that users have the right to know the value of their personal data.[13] Increasingly, users’ digital identity is becoming commodified through the selling and monetizing of their personal data for profit by large companies.[13] Unfortunately, many people appear to be unaware of the fact that their data holds monetary value that can potentially be used towards other products and services.[13] Thus, scholars are arguing for users’ to have increased awareness and transparency into this process so that users can become empowered and informed consumers of data.[13]

Surveillance Capitalism[edit]

The increased usage of cross-device tracking by advertisers is indicative of the rise of a new era of data extraction and analysis as a form of profit, or surveillance capitalism, a term coined by Shoshana Zuboff.[14] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user.[14] Zuboff suggests that this new era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive as, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold.[14] Thus, since cross-device tracking seeks to create a profile of a user across multiple devices, big tech companies, such as Google, could use this behavioral data to make predictions about the user’s future behavior without the user’s awareness.[14]

Scholars are beginning to discuss the possibility of quantifying the monetary value of users’ personal data. Notably, the algorithms used to extract and mine user data are increasingly seen as business assets and thus protected via trade secrets.[24] Indeed, the usage of free online services, such as public Wi-Fi, often comes at the unknown cost to the user of being tracked and profiled by the company providing the service.[24] In essence, a transaction is occurring: users’ personal data is being exchanged for access to a free service.[24] Increasingly, scholars are advocating for users’ right to understand the fundamental value of their personal data more intimately so as to be more savvy, informed consumers who have the ability to protect the privacy of their online information and not be manipulated into unwittingly giving away personal information.[24]

In addition, health and wellness applications also have a dearth of privacy protections as well: a study found that many health apps lacked encryption and that regulators should enforce stronger data privacy protections.[15] The study stated that of the 79 apps they tested, none of the applications locally encrypted the users’ personal information and 89% of the applications pushed the data online.[15] The lack of adequate privacy and security measures surrounding users’ personal medical data on mobile applications underscores the degree to which users can trust mobile app developers to safeguard their personal information online.[15] While mobile application developers continue to confront privacy and security concerns, users are increasingly looking to ways to visualize their data through wearable devices and applications that track their workout and exercise routines.[16] Indeed, researchers discovered that these self-tracking devices play a role as a tool, a toy, and a tutor in users’ lives.[16] In the tool role, the self-tracking device functions as a mechanism to help the user in some capacity, often to achieve personal health goals.[16] The toy role underscores how some self-tracking users see it as a fun game, particularly with regard to rewards and viewing the visualized data.[16] Lastly, the tutor role reflects how users gain insights from and motivation about their activity from the apps themselves.[16] Other scholars have characterized self-tracking as performing for the system, or controlling what is (or isn’t) recorded, performing for the self, tracking themselves to gain insight into their behavior, and performing for other people, or the importance of how other people viewed the person being tracked, as well as the control the person being tracked had over their data and thus how they are perceived.[18]

Additionally, privacy concerns surround cookies, flash cookies, and web beacons on websites today.[18] Ultimately, five main concerns surround the usage of cookies, flash cookies, and web beacons, according to a study:[5]

  • Firstly, the authors note that users lack anonymity online, with cookies utilizing unique identifiers and flash cookies enabling recognition of website visits[5]
  • Another concern the authors note is unintended uses of cookies, since cookies were initially designed to benefit the user’s experience and engagement online, but have since morphed into a business run by advertisers in which personal data is sold for profit[5]
  • Users are likely unaware of how their personal information is being used, reflecting the surreptitious nature of data collection[5]
  • Some cookies trespass into the web users’ own resources and are downloaded to the user’s computer often without the user’s awareness[5]
  • Lastly, the authors note that the threat of cookie sharing underscores how web users’ personal information can become combined with other data from websites and even a social security number to create a more cohesive picture of the user[5]

Data Capitalism[edit]

Other scholars have defined a similarly extractive and destructive phenomenon called data capitalism.[19] Data capitalism is an economic system enabling the redistribution of power towards those who have access to the information––namely, big corporations.[19] There are three fundamental theories of how large companies engage users in virtual communities, reflecting the power of data capitalism on users today:

  • The free and open network: in making products free, large companies make their products more accessible to a larger audience from which they can extract valuable data in exchange.[19]
  • The connection between people and machines: data capitalism promotes a connection between people and machines which is derived from the user’s relationship to the technology itself.[19] Increasingly, tracking and surveillance technology is profiling users and learning their preferences, users become more comfortable with their devices and a self-fulfilling prophecy continues.[19]
  • The value placed on data: new information asymmetries are proliferating that exacerbate inequality of information and allow only the most powerful access to most people’s data.[19] Increasingly, a scholar suggests that the lack of transparency over users’ data reflects the tension between privacy and community online.[19]

Legal and ethical issues[edit]

While the United States lacks extensive privacy rights, the Fourth Amendment provides some privacy protections.[5] The Fourth Amendment states that “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated,” suggesting that while individuals are protected from all levels of the government, they are not legally protected from private companies or individuals with malicious intent.[5]

There are large implications for this technology within the legal field. Legally, The Federal Trade Commission has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury.[21] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance.[21] For instance, in the realm of invasive web tracking, the FTC has brought lawsuits against companies who engage in ‘history sniffing’––a technique that enables companies to ascertain which links a user clicked on based on the color of the link.[21] Concerning tracking in physical space, the FTC has also cracked down on Nomi, a company that scans the MAC addresses of customers’ phones in stores.[21] MAC addresses function as a unique identifier, enabling the connection to wireless networks.[21] In the case of malware, the FTC has placed pressure on companies such as CyberSpy, a self-proclaimed email attachment company that claimed to secretly record users’ key presses.[21] The FTC has also cracked down on companies like Compete, a browser toolbar, because it decrypted users’ personal information on the internet, putting users at risk.[21] Lastly, in cases during which deception is used to engage in surveillance, the FTC has investigated private investigators, who surveil individuals on another person’s behalf.[21] In addition, audio beacon technology, used by an application called Silverpush, could violate the FTC’s policies because users were not made aware as to when the ultrasonic signals were being recorded.[21]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences as well as increased surveillance––which is sometimes hidden––in everyday life as violating users' rights to privacy.[14] The usage of surreptitious methods, in which the user is unaware of the extent to which he or she is being tracked, brings tracking mechanisms––such as cookies, flash cookies, and web beacons––into the ethical realm as well since users are not being informed of this tracking perhaps as often as they should.[5]

The future[edit]

Zuboff sees the rise of instrumentarian power––the power of companies to control, modify, and predict users' behaviors––as reflecting a new precedent for the future, as human freedom is stifled and limited by big corporations.[23]

See also[edit]

Week 8: Cross Device Tracking[edit]

Cross-device tracking refers to technology which enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.[1]

More specifically, cross-device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one.[2]. For example, one such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.[2]

This form of tracking is utilized primarily by technology companies and advertisers, who use this information to piece together a cohesive profile of the user, which informs and predicts the type of advertisements the user receives.[2]

Background[edit]

There are many ways in which online tracking has manifested itself. Historically, when companies wanted to track users’ online behavior, they simply had users sign in to their website.[1] This is a form of deterministic cross-device tracking, in which the user’s devices are associated with their account credentials, such as their email or username.[3] Consequently, while the user is logged in, the company can keep a running history of what sites the user has been to and which ads the user interacted with between computers and mobile devices.[3]

Eventually, cookies were deployed by advertisers, providing each user with a unique identifier in his or her browser so that the user’s preferences can be monitored.[4] This unique identifier informs the placement of relevant, targeted ads the user may receive.[4] However, as users began utilizing multiple devices––up to around five––advertisers became confused as to how to track, manage, and consolidate this data across multiple devices as the cookie-based model suggested that each device––whether a phone, computer, or tablet––was a different person.[4]

Other technologies such as supercookies, which stay on computers long after the user deletes his or her cookies, and web beacons, which are unique images from a URL, are also used by trackers and advertisers to gain increased visibility into users’ behavior.[4] However, advertisers were still limited in that only one device was able to be tracked and associated with a user.[4]

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is called browser fingerprinting, and occurs when browsers, which are modifiable to the users’ tastes, produce a unique signal that companies or advertisers can use to single out the user.[4] Browser fingerprinting has been a cause for concern because of its effectiveness and also since it does not allow for users to opt-out of the tracking.[4]

Another tactic used by Google is called “AdID” and works on smartphones in tandem with cookies on a user’s computer to track behavior across devices.[1]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers through the use of audio beacons, or inaudible sound, emitted by one device and recognized through the microphone of the other device, usually a smartphone.[2]

Humans interpret sound by picking up on different frequencies.[2] Given the variety of sound waves that exist, humans can only hear frequencies that are within a certain range––generally from 20Hz to 20kHz. Interestingly, by the age of 30, most humans cannot hear sounds above 18kHz.[2]

Ultrasound, which emits shorter wavelengths greater than or equal to 20kHz, enables the rapid transmission of data necessary for cross-device tracking to occur.[2]

Another integral component of cross-device tracking is the usage of audio beacons. Audio beacons are beacons that are embedded into ultrasound, so they cannot be heard by humans.[2] These audio beacons are used to surreptitiously track a user’s location and monitor online behavior by connecting with the microphone on another device without the user’s awareness.[2]

Applications[edit]

Studies have shown that 234 Android applications are eavesdropping on these ultrasonic channels without the user’s awareness.[2]

Applications such as Silverpush, Shopkick, and Lisnr are part of an “ultrasonic side-channel” in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user’s environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them[2].

  • Silverpush­­––the leading company utilizing this technology––patented software enabling them to track TV ads based on audio stream above[2]
  • Shopkick, another popular application, gives discounts to users who shop at stores which emit these ultrasonic beacons, allowing them to create a profile of the user[2]
  • Lisnr utilizes a user’s location data in tandem with ultrasonic beacons to give users coupons related to their activities[2]

Another study suggested that Apple, Google, and Bluetooth Special Interest groups need to do more to prevent cross-device tracking.[6]

Privacy and surveillance concerns with cross-device tracking[edit]

Ultrasonic tracking technologies can pose massive threats to users’ privacy. There are four primary privacy concerns associated with this new form of tracking:

  • The first is media tracking: audio from the user’s television may be detected by the microphone in the user’s mobile device, allowing malicious actors to gain access to what the user is watching––particularly if it is salacious.[2] Advertisers can similarly gain insight into what a user typically watches.[2] In both scenarios, a user’s real-world behavior is linked to their online identity and used for tracking.[2]
  • Another form of tracking permitted by ultrasonic tracking is cross-device tracking, which enables a user’s profile to be connected across multiple devices based on proximity.[2] This form of tracking, in linking different devices, can help advertisers show more targeted ads or open individuals to attacks by malicious actors.[2]
  • Location tracking is yet another privacy concern highlighted by Arp’s study.[2] Indeed, ultrasonic signals can convey location information via a location identifier, often placed in stores or businesses.[2]
  • Lastly, this new ultrasonic tracking poses a threat to users of Bitcoin and Tor because it deanonymizes users’ information, since ultrasonic signals associate the user’s mobile phone with the Bitcoin or Tor account.[2]

From cookies to ultrasonic trackers, some argue that invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners were able to be seen at all times by guards but were unable to detect when, or even if, they were being watched at all, creating a sense of paranoia that drove prisoners to carefully police their own behavior.[7] Similarly, scholars have drawn parallels between Bentham’s panopticon and today’s pervasive use of internet tracking in that individuals lack awareness to the vast disparities of power that exist between themselves and the corporation to which they willingly give their data.[7] In essence, companies are able to gain access to consumers’ activity when they use a company’s services.[7] The usage of these services often is beneficial, which is why users agree to exchange personal information.[7] However, since users participate in this unequal environment, in which corporations hold most of the power and in which the user is obliged to accept the bad faith offers made by the corporations, users are operating in an environment that ultimately controls, shapes and molds them to think and behave in a certain way, depriving them of privacy.[7]

Surveillance Capitalism[edit]

The increased usage of cross-device tracking by advertisers is indicative of the rise of a new era of data extraction and analysis as a form of profit, or surveillance capitalism, a term coined by Shoshana Zuboff.[14] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user.[14] Zuboff suggests that this new era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive as, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold.[14] Thus, since cross-device tracking seeks to create a profile of a user across multiple devices, big tech companies, such as Google, could use this behavioral data to make predictions about the user’s future behavior without the user’s awareness.[14]

Scholars are beginning to discuss the possibility of quantifying the monetary value of users’ personal data. Notably, the algorithms used to extract and mine user data are increasingly seen as business assets and thus protected via trade secrets.[24] Indeed, the usage of free online services, such as public Wi-Fi, often comes at the unknown cost to the user of being tracked and profiled by the company providing the service.[24] In essence, a transaction is occurring: users’ personal data is being exchanged for access to a free service.[24] Increasingly, scholars are advocating for users’ right to understand the fundamental value of their personal data more intimately so as to be more savvy, informed consumers who have the ability to protect the privacy of their online information and not be manipulated into unwittingly giving away personal information.[24]

In addition, health and wellness applications also have a dearth of privacy protections as well: a study found that many health apps lacked encryption and that regulators should enforce stronger data privacy protections.[15] The study stated that of the 79 apps they tested, none of the applications locally encrypted the users’ personal information and 89% of the applications pushed the data online.[15] The lack of adequate privacy and security measures surrounding users’ personal medical data on mobile applications underscores the degree to which users can trust mobile app developers to safeguard their personal information online.[15] While mobile application developers continue to confront privacy and security concerns, users are increasingly looking to ways to visualize their data through wearable devices and applications that track their workout and exercise routines.[16] Indeed, researchers discovered that these self-tracking devices play a role as a tool, a toy, and a tutor in users’ lives.[16] In the tool role, the self-tracking device functions as a mechanism to help the user in some capacity, often to achieve personal health goals.[16] The toy role underscores how some self-tracking users see it as a fun game, particularly with regard to rewards and viewing the visualized data.[16] Lastly, the tutor role reflects how users gain insights from and motivation about their activity from the apps themselves.[16] Other scholars have characterized self-tracking as performing for the system, or controlling what is (or isn’t) recorded, performing for the self, tracking themselves to gain insight into their behavior, and performing for other people, or the importance of how other people viewed the person being tracked, as well as the control the person being tracked had over their data and thus how they are perceived.[18]

Additionally, privacy concerns surround cookies, flash cookies, and web beacons on websites today.[18] Ultimately, five main concerns surround the usage of cookies, flash cookies, and web beacons, according to a study:[5]

  • Firstly, the authors note that users lack anonymity online, with cookies utilizing unique identifiers and flash cookies enabling recognition of website visits[5]
  • Another concern the authors note is unintended uses of cookies, since cookies were initially designed to benefit the user’s experience and engagement online, but have since morphed into a business run by advertisers in which personal data is sold for profit[5]
  • Users are likely unaware of how their personal information is being used, reflecting the surreptitious nature of data collection[5]
  • Some cookies trespass into the web users’ own resources and are downloaded to the user’s computer often without the user’s awareness[5]
  • Lastly, the authors note that the threat of cookie sharing underscores how web users’ personal information can become combined with other data from websites and even a social security number to create a more cohesive picture of the user[5]

Legal and ethical issues[edit]

While the United States lacks extensive privacy rights, the Fourth Amendment provides some privacy protections.[5] The Fourth Amendment states that “the right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated,” suggesting that while individuals are protected from all levels of the government, they are not legally protected from private companies or individuals with malicious intent.[5]

There are large implications for this technology within the legal field. Legally, The Federal Trade Commission has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury.[21] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance.[21] For instance, in the realm of invasive web tracking, the FTC has brought lawsuits against companies who engage in ‘history sniffing’––a technique that enables companies to ascertain which links a user clicked on based on the color of the link.[21] Concerning tracking in physical space, the FTC has also cracked down on Nomi, a company that scans the MAC addresses of customers’ phones in stores.[21] MAC addresses function as a unique identifier, enabling the connection to wireless networks.[21] In the case of malware, the FTC has placed pressure on companies such as CyberSpy, a self-proclaimed email attachment company that claimed to secretly record users’ key presses.[21] The FTC has also cracked down on companies like Compete, a browser toolbar, because it decrypted users’ personal information on the internet, putting users at risk.[21] Lastly, in cases during which deception is used to engage in surveillance, the FTC has investigated private investigators, who surveil individuals on another person’s behalf.[21] In addition, audio beacon technology, used by an application called Silverpush, could violate the FTC’s policies because users were not made aware as to when the ultrasonic signals were being recorded.[21]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences as well as increased surveillance––which is sometimes hidden––in everyday life as violating users' rights to privacy.[14] The usage of surreptitious methods, in which the user is unaware of the extent to which he or she is being tracked, brings tracking mechanisms––such as cookies, flash cookies, and web beacons––into the ethical realm as well since users are not being informed of this tracking perhaps as often as they should.[5]

The future[edit]

Zuboff sees the rise of instrumentarian power––the power of companies to control, modify, and predict users' behaviors––as reflecting a new precedent for the future, as human freedom is stifled and limited by big corporations.[23]

See also[edit]

Week 11 Peer Review[edit]

Week 11 by King666Field

I think you created a great page overall! I enjoyed reading your article and found myself immersed in new ideas and different viewpoints! You present multiple and diverse points of view on the topic, which really makes this article objective and comprehensive. The grammar and sentence structure look great, and I did not find any errors during my reading. My only suggestions are:

  1. The Privacy and Surveillance Concerns section contains so many words that people might easily get lost in it. You might try giving each paragraph a small bolded subtitle that summarizes its particular point of view in a few words, so that readers have something to refer back to.
  2. The future section seems to need a lot more content. I do not think it would affect the whole page much if this section were deleted. If you have abundant content for it, that would be great too!

All in all, a great page! Looking forward to seeing this page in the mainspace!

Week 10 Peer Review[edit]

Week 10 - Peer Review by Travelqueen27

Great work this week! I enjoyed reading your article and really learned a lot by the end. The lead section is structured very well and is concise in addressing what you will be covering throughout the article. Grammar and sentence structure looked great, but I did notice some minor things that I have listed below with my suggestions. I would personally not include "The Future" section because there is not a lot of content supporting it and it is also based on the opinion of one scholar rather than multiple scholars. Overall, a very solid article, and I look forward to the published product next week!

  • "The usage of these services often is beneficial, which is why users agree to exchange personal information" : would be great if you could provide a sentence or two on the benefits of exchanging personal information with companies
  • "Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one." This sentence was used twice, in the lead section and the background. Maybe try rewording or just delete.
  • "Location tracking is yet another privacy concern highlighted by Arp’s study" This sentence is written in an essay-like form. I would delete "Arp's study" to make it more of an encyclopedic tone.
  • hyperlink "sousveillance"
  • Fix the format of some of the citations; the "References" section gives you suggestions on how to fix them.

Week 8 Peer Reviews[edit]

Week 8 - Peer review by QuixoticWindmills[edit]

I like this draft a lot!  It’s well written and things are clearly explained.  I was a little confused by the lead section, which appears to give two distinct definitions for cross device tracking (Also, is it cross-device tracking or cross device tracking?).  I like the background section, although you could consider splitting the Background section into another section specifically for technology, it seems like half of it is social background while the other half is about technological techniques.  You might want to double check with one of the leadership team about whether or not we should put specific numbers from studies in the wikipedia article (Applications: “Studies have shown that 234 Android applications are eavesdropping”), it might be considered as leaning a bit too hard on a specific source (this is just my take though).  I would advise being careful with jargon and acronyms; for instance, you write about MAC addresses without explaining what MAC stands for.  

Week 6 Peer Reviews[edit]

Week 6 - Peer review by Starshine44[edit]

Very thorough for your first draft, great work! The background is helpful. I didn't fully understand what the issue of the topic was until I reached the Applications section, which clearly explained what happens with these cross device transmissions. What a very interesting topic! I am eager to see how it expands and evolves. The writing is sound and free of obvious errors. It holds an encyclopedic tone, is balanced, informative, and well cited. The sections make sense, and flow well into the next subject.

Week 6 - Peer review by Tm670[edit]

Wonderful first draft! Great job. The lead section does a great job of encompassing all the information as a "thesis" for the rest of the article. Though, I am not completely still aware of what cross-device tracking is. I think the information you provide explains what it does, but not what it is. On the other hand, I think you did a great job of explaining the breadth of technology that this impacts. Your article sets up the topic well by providing a history of a "traditional" tracking such as logging into websites. However, I think it is unclear what types of companies and where they are located.

The first section does a wonderful job of explaining the types of technology and ways the tracking were executed as well. Under the Applications section, you mention Android but there is no mention of iPhone or Apple. I think including them would provide a better balance and a more comprehensive approach. You end the article with "The Future" which I think might border on making a suggestion or proposal -- so I would be cautious about wording in this section. Overall, I think you provide a fair encyclopedic tone, balance, and diction.


Week 8 Peer Reviews[edit]

Week 8 - Peer review by Relaxbear4649[edit]

Great job on your article so far! I really liked the organization of the article as it made the transition to each section very easy. The "Background" section contains a lot of useful information but I think it might be helpful to break it up into smaller sections so that the readers can digest the information in small bits. I think the use of bullet points in the "Applications" and "Privacy/surveillance issues" sections really helped with the organization of the different concepts. I also really liked that you added lots of hyperlinks and citations throughout your article! I would also love to see more information about the legal and ethical concerns of Cross-device tracking. Overall, the grammar and sentence structure looks great! Great work and I look forward to how your final article turns out!

Week 9 Peer Review: Edit4Change[edit]

Great job on your article. I learned a lot of interesting things about cross-device tracking from your article and picked up a lot of new terms. I would recommend simplifying certain lines, such as "More specifically, cross device tracking is a technique in which technology companies and advertisers deploy trackers, often in the form of unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices, not simply one." This line can be simplified to "Cross-device tracking is a technique used by technology companies and advertisers. These entities deploy trackers, such as unique identifiers, cookies, or even ultrasonic signals, to generate a profile of users across multiple devices." Splitting long sentences into two parts will make your article an easier read and will allow you to fit in more information. Overall, you did a great job! I learned a lot from your article.

First Draft: Cross-device tracking[edit]

Cross-device tracking refers to technology which enables the tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers.[1] One such form of this tracking uses audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.[2]

This form of tracking is utilized primarily by technology companies and advertisers, who use this information to piece together a cohesive profile of the user, which informs and predicts the type of advertisements the user receives.[2]

Background[edit]

There are many ways in which online tracking has manifested itself online. Historically, when companies wanted to track users’ online behavior, they simply had users sign in to their website.[1] This is a form of deterministic cross-device tracking, in which the user’s devices are associated with their account credentials, such as their email or username.[3] Consequently, while the user is logged in, the company can keep a running history of what sites the user has been to and which ads the user interacted with between computers and mobile devices.[3]

Eventually, cookies were deployed by advertisers, providing each user with a unique identifier in his or her browser so that the user’s preferences can be monitored.[4] This unique identifier informs the placement of relevant, targeted ads the user may receive.[4] However, as users began utilizing multiple devices––up to around five––advertisers became confused as to how to track, manage, and consolidate this data across multiple devices as the cookie-based model suggested that each device––whether a phone, computer, or tablet––was a different person.[4]

Other technologies such as supercookies, which stay on computers long after the user deletes his or her cookies, and web beacons, which are unique images from a URL, are also used by trackers and advertisers to gain increased visibility into users’ behavior.[4] However, advertisers were still limited in that only one device was able to be tracked and associated with a user.[4]

Thus, cross-device tracking initially emerged as a means of generating a profile of users across multiple devices, not simply one.

One such tactic for cross-device tracking is called browser fingerprinting, and occurs when browsers, which are modifiable to the users’ tastes, produce a unique signal that companies or advertisers can use to single out the user.[4] Browser fingerprinting has been a cause for concern because of its effectiveness and also since it does not allow for users to opt-out of the tracking.[4]

Another tactic used by Google is called “AdID” and works on smartphones in tandem with cookies on a user’s computer to track behavior across devices.[1]

Now, cross-device tracking has evolved into a new, radical form of surveillance technology which enables users to be tracked across multiple devices, including smartphones, TVs, and personal computers, through the use of audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of another device, usually a smartphone.[2]

Humans interpret sound by picking up on different frequencies, but can only hear frequencies within a certain range––generally from 20 Hz to 20 kHz.[2] By the age of 30, most humans cannot hear sounds above 18 kHz.[2]

Ultrasound, which consists of sound at frequencies of 20 kHz or higher (and therefore with shorter wavelengths), enables the rapid transmission of the data necessary for cross-device tracking to occur.[2]
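
As an illustration of how data can ride on near-ultrasonic sound (a simplified sketch, not any vendor's actual protocol), the following Python code encodes a short bit string as bursts of two high-frequency tones, a basic frequency-shift-keying scheme with assumed parameters:

    import numpy as np

    SAMPLE_RATE = 48_000              # samples per second (assumed)
    BIT_DURATION = 0.05               # seconds of tone per bit
    FREQ_0, FREQ_1 = 18_500, 19_500   # near-ultrasonic carriers for "0" and "1"

    def encode_bits(bits):
        """Return an audio signal carrying the bit string as tone bursts."""
        t = np.arange(int(SAMPLE_RATE * BIT_DURATION)) / SAMPLE_RATE
        bursts = [np.sin(2 * np.pi * (FREQ_1 if b == "1" else FREQ_0) * t) for b in bits]
        return np.concatenate(bursts)

    signal = encode_bits("1011")
    print(signal.shape)  # (9600,): four 50 ms bursts at 48 kHz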

Another integral component of cross-device tracking is the use of audio beacons. Audio beacons are short identifying signals embedded in ultrasound, so they cannot be heard by humans.[2] These beacons are used to surreptitiously track a user’s location and monitor online behavior by connecting with the microphone of another device without the user’s awareness.[2]
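
A companion sketch, using the same assumed parameters as above, shows how a listening app could recover such a beacon from microphone samples by comparing the FFT energy at the two carrier frequencies in each 50 ms window:

    import numpy as np

    SAMPLE_RATE = 48_000
    BIT_DURATION = 0.05
    FREQ_0, FREQ_1 = 18_500, 19_500
    SAMPLES_PER_BIT = int(SAMPLE_RATE * BIT_DURATION)

    def decode_bits(signal):
        """Recover a bit string from a signal made of FREQ_0/FREQ_1 tone bursts."""
        bits = []
        for start in range(0, len(signal), SAMPLES_PER_BIT):
            window = signal[start:start + SAMPLES_PER_BIT]
            spectrum = np.abs(np.fft.rfft(window))
            freqs = np.fft.rfftfreq(len(window), d=1 / SAMPLE_RATE)
            energy_0 = spectrum[np.argmin(np.abs(freqs - FREQ_0))]
            energy_1 = spectrum[np.argmin(np.abs(freqs - FREQ_1))]
            bits.append("1" if energy_1 > energy_0 else "0")
        return "".join(bits)

    # Self-test with a synthetic two-bit beacon ("1" then "0"):
    t = np.arange(SAMPLES_PER_BIT) / SAMPLE_RATE
    test = np.concatenate([np.sin(2 * np.pi * FREQ_1 * t), np.sin(2 * np.pi * FREQ_0 * t)])
    print(decode_bits(test))  # "10"

In practice a real system would need framing, filtering, and error correction, but the basic idea is the same: the identifier travels in a frequency band that devices can detect and most humans cannot hear.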

Applications[edit]

One study found that 234 Android applications eavesdrop on these ultrasonic channels without the user’s awareness.[2]

Applications such as Silverpush, Shopkick, and Lisnr are part of an “ultrasonic side-channel” in which the app, often unbeknownst to the user, intercepts ultrasonic signals emitted from the user’s environment, such as from a TV, to track which advertisements the user has heard and how long the person listened to them.[2]

  • Silverpush, the leading company utilizing this technology, patented software that enables it to track TV advertisements based on the ultrasonic audio stream described above.[2]
  • Shopkick, another popular application, gives discounts to users who shop at stores that emit these ultrasonic beacons, allowing it to create a profile of the user.[2]
  • Lisnr uses a user’s location data in tandem with ultrasonic beacons to give users coupons related to their activities.[2]

Privacy/surveillance issues[edit]

These technologies pose significant threats to users’ privacy. There are four primary privacy concerns associated with this new form of ultrasonic tracking.

  • The first is media tracking: audio from the user’s television may be detected by the microphone in the user’s mobile device, allowing malicious actors to gain access to what the user is watching––particularly if it is salacious.[2] Advertisers can similarly gain insight into what a user typically watches.[2] In both scenarios, a user’s real-world behavior is linked to their online identity and used for tracking.[2]
  • Another form of tracking permitted by ultrasonic tracking is cross-device tracking, which enables a user’s profile to be connected across multiple devices based on proximity.[2] This form of tracking, in linking different devices, can help advertisers show more targeted ads or open individuals to attacks by malicious actors.[2]
  • Location tracking is yet another privacy concern highlighted by Arp’s study.[2] Indeed, ultrasonic signals can convey location information via a location identifier, often placed in stores or businesses.[2]
  • Lastly, this new ultrasonic tracking poses a threat to users of Bitcoin and Tor because it deanonymizes users’ information, since ultrasonic signals associate the user’s mobile phone with the Bitcoin or Tor account.[2]

From cookies to ultrasonic trackers, some argue that these invasive forms of surveillance underscore how users are trapped in a digital panopticon, similar to the concept envisioned by Jeremy Bentham: a prison in which the prisoners could be seen at all times by guards but could not detect when, or even if, they were being watched, creating a sense of paranoia that drove prisoners to carefully police their own behavior.[7] Similarly, scholars have drawn parallels between Bentham’s panopticon and today’s pervasive internet tracking, in that individuals lack awareness of the vast disparities of power between themselves and the corporations to which they willingly give their data.[7] In essence, companies are able to gain access to consumers’ activity when those consumers use the companies’ services.[7] The usage of these services is often beneficial, which is why users agree to exchange personal information.[7] However, because users participate in this unequal environment, in which corporations hold most of the power and the user is obliged to accept the bad-faith offers made by the corporations, they operate in an environment that ultimately controls, shapes, and molds them to think and behave in a certain way, depriving them of privacy.[7]

The increased usage of cross-device tracking by advertisers is indicative of a new era of data extraction and analysis for profit, or surveillance capitalism, a term coined by Shoshana Zuboff.[14] This form of capitalism seeks to commodify private human experience to create behavioral futures markets, in which behavior is predicted and behavioral data is harvested from the user.[14] Zuboff suggests that this era of surveillance capitalism eclipses Bentham's panopticon, becoming far more encroaching and invasive because, unlike a prison, there is no escape, and the thoughts, feelings, and actions of users are immediately extracted to be commodified and resold.[14] Thus, because cross-device tracking seeks to create a profile of a user across multiple devices, big technology companies such as Google could use this behavioral data to make predictions about the user’s future behavior without the user’s awareness.[14]

In addition, health and wellness applications often lack privacy protections: a study found that many health apps lacked encryption and recommended that regulators enforce stronger data privacy protections.[15]

Legal and ethical issues[edit]

This technology has significant implications within the legal field. The Federal Trade Commission (FTC) has a responsibility to prevent deceptive practices by technology companies, such as those that could lead to consumer injury.[21] The FTC has made efforts to prevent invasive web tracking, tracking in physical space, malware, insecure and poorly designed services, and the use of deception to engage in surveillance.[21]

Ethically, Zuboff points to the extraction, commodification, and analysis of private human experiences as well as increased surveillance––which is sometimes hidden––in everyday life as violating users' rights to privacy.[14]

The future[edit]

Zuboff sees the rise of instrumentarian power––the power of companies to control, modify, and predict users' behaviors––as setting a precedent for a future in which human freedom is stifled and limited by big corporations.[23]

See also[edit]


Outline for Cross-device tracking[edit]

Week 5 Article Outline: Cross-Device Tracking

I noticed that there’s a lot missing from the article on Wikipedia as it currently stands:

-      There is very limited information about the threats to privacy that this new technology poses

-      The information is a bit scattered, focusing on an FTC ruling without providing any context as to the history or purpose of cross-device tracking

-      Most of the cited sources are news websites and only one is a journal article, calling into question the reliability of the information used in the article

-      Mainly, the article lacks depth of content, which I hope to provide when I fully edit the article

Thus, I would improve the article by editing the lead section a bit so it looks like this, for example:


Cross-device tracking refers to technology which enables tracking of users across multiple devices such as smartphones, television sets, smart TVs, and personal computers through the usage of audio beacons, or inaudible sounds, emitted by one device and recognized through the microphone of the other device.

This form of tracking is utilized by advertisers, who use this information to piece together a cohesive understanding of the user to inform and predict the type of advertisements the user receives.


I would also improve the quality of the sources by using academic journals, providing more in-depth factual and relevant data, and also providing more context for certain occurrences while also ensuring that the article fully addresses threats to privacy.


My outline of additional sections would look like this, for example:


1.    Background/Context

  • Cross-device tracking
  • Ultrasound
  • Audio beacons

2.    Applications

  • Lisnr
  • Silverpush
  • Shopkick

3.    Privacy/Surveillance

  • Companies
  • Advertisers
  • Panoptic control
  • Surveillance capitalism
  • Profiling users

4.    Legal

  • Implications
  • Right to privacy/invasion of privacy

5.    The future

  • Where could this technology evolve/mutate from here?
  • Can we control/change this technology for good?


Related Wikipedia pages:

-      https://en.wikipedia.org/wiki/Behavioral_targeting

-      https://en.wikipedia.org/wiki/Internet_privacy

-      https://en.wikipedia.org/wiki/SilverPush

-      https://en.wikipedia.org/wiki/Surveillance

-      https://en.wikipedia.org/wiki/Website_visitor_tracking

-      https://en.wikipedia.org/wiki/Advertising#New_technology

-      https://en.wikipedia.org/wiki/Targeted_advertising

-      https://en.wikipedia.org/wiki/Surveillance_capitalism

-      https://en.wikipedia.org/wiki/Privacy_concerns_with_social_networking_services

-      https://en.wikipedia.org/wiki/Behavioral_analytics


Cross-device tracking[edit]

As someone deeply fascinated by the panoptic surveillance mechanisms used by today's technology companies, I find cross-device tracking a perfect topic through which to explore the increasingly embedded tracking mechanisms utilized by large technology companies to exploit and monetize our data while also influencing our thoughts and behavior via targeted advertisements. Thus, I plan to contribute much more specific information concerning the implications of cross-device tracking, such as its effects on people’s behavior, habits, location, and even the type of advertisements they receive. I believe that after immersing myself in the literature further, I will be able to more fully flesh out and delineate the criteria I can use to create a cohesive Wikipedia article on the topic.


A rough outline of my article would look something like this:


Cross-device tracking

-      Ultrasonic signal tracking

-      Behavioral data mining and surveillance

-      Location Tracking

-      Surveillance of habits and purchases

-      New technologies and corporate surveillance

Article evaluation: Surveillance Capitalism[edit]

Content[edit]

  • Is everything in the article relevant to the article topic? Is there anything that distracted you?
    • Yes, everything in the article is relevant to the topic and nothing distracted me.
  • Is any information out of date? Is anything missing that could be added?
    • While the information does not seem out of date, the concept of Surveillance Capitalism as created by Shoshana Zuboff needs to be fleshed out and developed more than it currently is. Specifically going into depth on the main points of her book might be a start.
  • What else could be improved?
    • Aside from further fleshing out and developing the article, I'd increase and improve the citations. One of the citations states that a "better source [is] needed," so I would definitely address that.

Tone[edit]

  • Is the article neutral? Are there any claims that appear heavily biased toward a particular position?
    • The article seems neutral and none of the claims appeared as if they were biased toward a particular position.
  • Are there viewpoints that are overrepresented, or underrepresented?
    • I think more of Zuboff's own work and ideas could be contributed to the article as well as perhaps more viewpoints as to her impact within the business and technological realms.

Sources[edit]

  • Check a few citations. Do the links work? Does the source support the claims in the article?
    • The links do work and the sources support the claims in the article.
  • Is each fact referenced with an appropriate, reliable reference? Where does the information come from? Are these neutral sources? If biased, is that bias noted?
    • Each fact is referenced with appropriate, reliable references, though I'd hope that more could be added and more sources/information could be incorporated into the article. Additionally, as I noted earlier, one of the citations needs to be verified, so that needs to be looked into. The information seems to largely be coming from journal articles and newspaper articles from institutions with a high reputation for quality content, such as WIRED and the New York Times.

Talk Page[edit]

  • What kinds of conversations, if any, are going on behind the scenes about how to represent this topic?
    • Behind the scenes, there appears to be a running debate about whether Zuboff "coined or used/popularized" surveillance capitalism. This debate appears to be the main focus of the talk page thus far.
  • How is the article rated? Is it a part of any WikiProjects?
    • The article is rated start-class, high-importance on WikiProject Mass surveillance; start-class, low-importance on WikiProject Business; start-class, mid-importance on WikiProject Economics; start-class on WikiProject Politics; start-class on WikiProject Law; start-class on WikiProject International law; start-class on WikiProject Globalization; start-class on WikiProject Sociology; start-class on WikiProject Computing; start-class on WikiProject History; start-class, mid-importance on WikiProject Internet; start-class on WikiProject Internet culture; start-class on WikiProject Futures studies; start-class on WikiProject Google; and start-class, high-importance on WikiProject Capitalism.[25]
  • How does the way Wikipedia discusses this topic differ from the way we've talked about it in class?
    • We haven't yet discussed this topic in class, but this article makes clear that it is in part about the extent to which companies knowingly exploit the capitalist system of expansion and monetization of users' data, which infringes on people's inherent right to privacy.

Article evaluation: Information Privacy[edit]

Content[edit]

  • Is everything in the article relevant to the article topic? Is there anything that distracted you?
    • Yes, this article is very relevant to the article topic. Nothing distracted me.
  • Is any information out of date? Is anything missing that could be added?
    • The article does need to be updated to have more relevant and comprehensive answers. There's little information about the assaults on privacy by the US government via COINTELPRO in the past or via stingrays today, for instance.
  • What else could be improved?
    • I feel as though the topic is slightly underdeveloped. Given the massive amount of information available about information privacy, I feel as though the Wikipedia article could have included more information so as to give readers a more comprehensive understanding of the extent to which privacy is entangled within our everyday lives.

Tone[edit]

  • Is the article neutral? Are there any claims that appear heavily biased toward a particular position?
    • The article appears relatively neutral and no claims appear heavily biased toward a particular position.
  • Are there viewpoints that are overrepresented, or underrepresented?
    • The article refers extensively to the Safe Harbor program, but fails to discuss other aspects of information privacy such as human rights or the right to privacy within the U.S. in depth, for instance.

Sources[edit]

  • Check a few citations. Do the links work? Does the source support the claims in the article?
    • The links work and the sources do support the claims made in the article. However, many of the links go to other Wikipedia pages that talk more in depth on the issues highlighted in the main page. While I think this is an effective tool to have in Wikipedia, I believe this article relies too heavily on these links and instead should work to further research and develop what's already there.
  • Is each fact referenced with an appropriate, reliable reference? Where does the information come from? Are these neutral sources? If biased, is that bias noted?
    • While some areas of the article are referenced with appropriate, reliable references, other areas, such as the section on financial privacy, have no citations at all. The sources themselves appear to largely be neutral and adequate. One of the sources, however, is from enotes, which is likely to contain unverified information and be biased.

Talk Page[edit]

  • What kinds of conversations, if any, are going on behind the scenes about how to represent this topic?
    • The most recent conversations discuss and debate switching the title to "informational privacy" and incorporating privacy protection in India and China.
  • How is the article rated? Is it a part of any WikiProjects?
    • The article was given a "C-class" rating and "high-importance" on the WikiProject Computing, WikiProject Internet, and WikiProject Mass Surveillance pages.
  • How does the way Wikipedia discusses this topic differ from the way we've talked about it in class?
    • The Wikipedia discussion reflected a diverse set of very knowledgeable people who seemed well versed in Wikipedia's core policies. Our class, on the other hand, is still learning how to use Wikipedia and is thus still acclimating itself to Wikipedia's editing platform and core policies.


References[edit]

  1. Jebara, Tony; Bellovin, Steven M.; Kim, Hyungtae; Li, Jie S.; Zimmeck, Sebastian (2017). "A Privacy Analysis of Cross-device Tracking": 1391–1408.
  2. Arp, Daniel. "Privacy Threats through Ultrasonic Side Channels on Mobile Devices". IEEE European Symposium on Security and Privacy: 1–13 – via IEEE Xplore.
  3. Brookman, Justin (2017). "Cross-Device Tracking: Measurement and Disclosures" (PDF). Proceedings on Privacy Enhancing Technologies. 2017 (2): 133–148. doi:10.1515/popets-2017-0020. S2CID 2101512.
  4. "Comments for November 2015 Workshop on Cross-Device Tracking" (PDF).
  5. Sipior, Janice C.; Ward, Burke T.; Mendoza, Ruben A. (2011-03-30). "Online Privacy Concerns Associated with Cookies, Flash Cookies, and Web Beacons". Journal of Internet Commerce. 10 (1): 1–16. doi:10.1080/15332861.2011.558454. ISSN 1533-2861.
  6. Korolova, Aleksandra; Sharma, Vinod (2018). "Cross-App Tracking via Nearby Bluetooth Low Energy Devices". Proceedings of the Eighth ACM Conference on Data and Application Security and Privacy. CODASPY '18. New York, NY, USA: ACM: 43–52. doi:10.1145/3176258.3176313. ISBN 9781450356329. S2CID 3933311.
  7. Campbell, John Edward; Carlson, Matt (2002). "Panopticon.com: Online Surveillance and the Commodification of Privacy". Journal of Broadcasting & Electronic Media. 46 (4): 586–606. doi:10.1207/s15506878jobem4604_6. ISSN 0883-8151. S2CID 144277483.
  8. Wellman, Barry; Nolan, Jason; Mann, Steve (2003). "Sousveillance: Inventing and Using Wearable Computing Devices for Data Collection in Surveillance Environments". Surveillance & Society. 1 (3): 331–355. doi:10.24908/ss.v1i3.3344. ISSN 1477-7487.
  9. Rosen, Christine (2004). "The Age of Egocasting". The New Atlantis (7): 51–72. ISSN 1543-1215. JSTOR 43152146.
  10. Koskela, Hille (2004). "Webcams, TV Shows and Mobile phones: Empowering Exhibitionism". Surveillance & Society. 2 (2/3). doi:10.24908/ss.v2i2/3.3374. ISSN 1477-7487.
  11. Molz, Jennie Germann (2006). "'Watch us wander': mobile surveillance and the surveillance of mobility". Environment and Planning A. 38 (2): 377–393. doi:10.1068/a37275. ISSN 0308-518X. S2CID 145772112.
  12. Razaghpanah, Abbas; Nithyanand, Rishab; Vallina-Rodriguez, Narseo; Sundaresan, Srikanth; Allman, Mark; Kreibich, Christian; Gill, Phillipa. "Apps, Trackers, Privacy and Regulators: A Global Study of the Mobile Tracking Ecosystem". www.icsi.berkeley.edu. Retrieved 2019-04-11.
  13. Malgieri, Gianclaudio; Bart Custers. "ScienceDirect". www.sciencedirect.com. Retrieved 2019-04-11.
  14. Zuboff, Shoshana (2015). "Big other: Surveillance Capitalism and the Prospects of an Information Civilization". Journal of Information Technology. 30 (1): 75–89. doi:10.1057/jit.2015.5. ISSN 0268-3962. S2CID 15329793.
  15. Huckvale, Kit; Prieto, José Tomás; Tilney, Myra; Benghozi, Pierre-Jean; Car, Josip (2015-09-25). "Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment". BMC Medicine. 13 (1): 214. doi:10.1186/s12916-015-0444-y. PMC 4582624. PMID 26404673.
  16. "ScienceDirect". www.sciencedirect.com. Retrieved 2019-04-04.
  17. Lyall, Ben; Robards, Brady (2018-03-01). "Tool, toy and tutor: Subjective experiences of digital self-tracking". Journal of Sociology. 54 (1): 108–124. doi:10.1177/1440783317722854. ISSN 1440-7833. S2CID 149319901.
  18. Gross, Shad; Bardzell, Jeffrey; Bardzell, Shaowen; Stallings, Michael (2017-11-02). "Persuasive Anxiety: Designing and Deploying Material and Formal Explorations of Personal Tracking Devices". Human–Computer Interaction. 32 (5–6): 297–334. doi:10.1080/07370024.2017.1287570. ISSN 0737-0024. S2CID 2557583.
  19. West, Sarah Myers (2017-07-05). "Data Capitalism: Redefining the Logics of Surveillance and Privacy". Business & Society. 58 (1): 20–41. doi:10.1177/0007650317718185. ISSN 0007-6503. S2CID 157945904.
  20. "A Contextual Approach to Privacy Online". American Academy of Arts & Sciences. Retrieved 2019-04-18.
  21. Hoofnagle, Chris Jay (2017-09-01). "FTC Regulation of Cybersecurity and Surveillance". Rochester, NY. SSRN 3010205.
  22. Fairfield, Joshua A.T. (2012). "Mixed Reality: How the Laws of Virtual Worlds Govern Everyday Life". Berkeley Technology Law Journal. 27 (1): 55–116. ISSN 1086-3818. JSTOR 24119476.
  23. Zuboff, Shoshana (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Hachette Book Group. pp. 1–691.
  24. "ScienceDirect". www.sciencedirect.com. Retrieved 2019-04-04.
  25. "Surveillance capitalism", Wikipedia, 2019-02-17, retrieved 2019-02-28.