Facebook Facial Recognition faces legal challenges, while Google avoids scrutiny



From the historical surveillance of civil rights leaders by the Federal Bureau of Investigation (FBI) to the current misuse of facial recognition technologies, surveillance patterns often reflect existing societal biases and build upon harmful, self-reinforcing cycles. Facial recognition and other surveillance technologies also enable more precise discrimination, especially as law enforcement agencies continue to make misinformed predictive decisions around arrest and detainment that disproportionately impact marginalized populations.


In this paper, we present the case for stronger federal privacy protections with proscriptive guardrails for the public and private sectors to mitigate the high risks that are associated with the development and procurement of surveillance technologies. We also discuss the role of federal agencies in addressing the purposes and uses of facial recognition and other monitoring tools under their jurisdiction, as well as increased training for state and local law enforcement agencies to prevent the unfair or inaccurate profiling of people of color. We conclude the paper with a series of proposals that lean either toward clear restrictions on the use of surveillance technologies in certain contexts, or greater accountability and oversight mechanisms, including audits, policy interventions, and more inclusive technical designs.




Hostility toward Facebook's facial recognition, but not Google's




Although suspicion toward communities of color has historical roots that span decades, new developments like facial recognition technologies (FRT) and machine learning algorithms have drastically enlarged the precision and scope of potential surveillance.14 Federal, state, and local law enforcement agencies often rely upon tools developed within the private sector, and, in certain cases, can access massive amounts of data either stored on private cloud servers or hardware (e.g., smartphones or hard drives) or available in public places like social media or online forums.15 In particular, several government agencies have purchased access to precise geolocation history from data aggregators that compile information from smartphone apps or wearable devices. In the general absence of stronger privacy protections at the federal or state levels to account for such advancements in technology, enhanced forms of surveillance used by police officers pose significant risks to civilians already targeted in the criminal justice system and further the historical biases affecting communities of color. Next, we present tangible examples of how the private and public sectors both play a critical role in amplifying the reach of law enforcement through facial recognition and other surveillance technologies.
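To make the technical leap concrete: modern facial recognition reduces a face photo to a numeric embedding and then searches a gallery of stored embeddings for the closest match, which is why a single probe image can be checked against millions of enrolled faces almost instantly. The sketch below is a minimal, hypothetical illustration of that matching step only; the embeddings are random stand-ins rather than outputs of a real face-encoder model, and the names and threshold are invented.

```python
import numpy as np

# Hypothetical 128-dimensional embeddings standing in for the output of a
# face-encoder model. In a real system, each enrolled photo would be run
# through the model once and its embedding stored in the gallery.
rng = np.random.default_rng(0)
gallery = {
    "person_A": rng.normal(size=128),
    "person_B": rng.normal(size=128),
    "person_C": rng.normal(size=128),
}
# A new probe photo of person_B, simulated as that embedding plus small noise.
probe = gallery["person_B"] + rng.normal(scale=0.03, size=128)

def match(probe_vec, gallery, threshold=0.6):
    """Return the closest gallery identity if its distance falls under the threshold."""
    best_name, best_dist = None, float("inf")
    for name, vec in gallery.items():
        dist = np.linalg.norm(probe_vec - vec)  # Euclidean distance between embeddings
        if dist < best_dist:
            best_name, best_dist = name, dist
    return (best_name, best_dist) if best_dist < threshold else (None, best_dist)

print(match(probe, gallery))  # expected: ("person_B", <small distance>)
```

The same mechanics that make gallery search fast also make the match threshold a policy choice: set it loosely and false matches rise, which is where the accuracy and profiling concerns discussed in this paper enter.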


Facial recognition has become a commonplace tool for law enforcement officers at both the federal and municipal levels. Out of the approximately 42 federal agencies that employ law enforcement officers, the Government Accountability Office (GAO) found in 2021 that about 20, or roughly half, used facial recognition. In 2016, Georgetown Law researchers estimated that approximately one out of four state and local law enforcement agencies had access to the technology.16


Clearview AI, which scrapes billions of publicly posted images from social media and other websites to power its face-search service for law enforcement, is only one of numerous private companies that U.S. government agencies partner with to collect and process personal information.19 Another example is Vigilant Solutions, which captures image and location information of license plates from billions of cars parked outside homes, stores, and office buildings, and which had sold access to its databases to approximately 3,000 local law enforcement agencies as of 2016.20 Vigilant also markets various facial recognition products like FaceSearch to federal, state, and local law enforcement agencies; its customer base includes the DOJ and DHS, among others.21 A third company, ODIN Intelligence, partners with police departments and local government agencies to maintain a database of individuals experiencing homelessness, using facial recognition to identify them and search for sensitive personal information such as age, arrest history, temporary housing history, and known associates.22


In the end, it is virtually impossible for an individual to fully opt out of facial recognition identification or control the use of their images without abstaining from public areas, the internet, or society altogether.


As both the government and private corporations feed into the problem of surveillance, gaps in current federal and state privacy laws mean that their actions to collect, use, or share data often go unchallenged. In other words, existing laws do not adequately protect user privacy amid the rising ubiquity of facial recognition and other emerging technologies, fundamentally omitting the needs of communities of color that disproportionately bear the consequences of surveillance. To reduce the potential for emerging technologies to replicate historical biases in law enforcement, we summarize recent proposals that address racial bias and unequal applications of technology in the public sector. We also explain why U.S. federal privacy legislation is necessary to govern how private sector companies implement fairness in the technical development process, limit their data collection and third-party sharing, and grant more agency to the individuals they surveil.


Amazon refused to accede to employee demands to end its facial recognition software pilot programs with the two police departments, but the Orlando Police Department has since decided to drop the program. During the furor, Amazon Web Services defended its motives in a statement to the press that said, "Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology."


Since the face is one of the most important cues in social interaction, there has also been accumulating evidence that the hostile attribution bias leads to a characteristic misperception of facial expressions. For example, inmates diagnosed with antisocial personality disorder or psychopathy have been found to show deficits in emotion expression recognition [7,8,9]. While hostile intentions could in theory be ascribed to any ambiguous facial expression, the bias seems to be triggered most strongly when the expression contains some amount of anger [10].


There was also a main effect of face gender, in that expressions on female faces were easier to recognize across all participant groups (Table 3). This was especially true for disgust and sadness, which were significantly easier to recognize in the female models, as indicated by the face gender by expression interaction. Overall, the results indicate that no inmate group showed grossly impaired recognition of full-blown facial expressions.
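For readers unfamiliar with how such effects are typically tested, the sketch below shows one common way to model per-cell recognition accuracy with a linear mixed model: face gender and expression as crossed fixed effects and participant as a random intercept, so the face gender by expression interaction reported above corresponds to the interaction terms in the fit. The data, effect sizes, and variable names here are synthetic and hypothetical; this illustrates the analysis structure, not the study's actual numbers.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in data: each hypothetical participant contributes one
# accuracy score per face-gender x expression cell (all values are invented).
rng = np.random.default_rng(1)
rows = []
for pid in range(40):
    for face_gender in ["female", "male"]:
        for expression in ["anger", "disgust", "fear", "sadness"]:
            base = 0.80 + (0.05 if face_gender == "female" else 0.0)
            if face_gender == "female" and expression in ("disgust", "sadness"):
                base += 0.05  # mimic the reported interaction pattern
            rows.append({"pid": pid,
                         "face_gender": face_gender,
                         "expression": expression,
                         "accuracy": float(np.clip(base + rng.normal(scale=0.08), 0, 1))})
df = pd.DataFrame(rows)

# Mixed model: fixed effects for face gender, expression, and their interaction;
# a random intercept per participant accounts for the repeated measures.
model = smf.mixedlm("accuracy ~ face_gender * expression", data=df, groups=df["pid"])
print(model.fit().summary())
```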


Facial expression recognition has been studied extensively, including in relation to social anxiety. Nonetheless, only a limited number of studies have examined recognition of disgust expressions. Their results suggest that disgust is perceived as more threatening than anger, and thus may invite more extreme responses. However, few studies have examined behavioral responses to facial expressions, and those that have focused on approach-avoidance responses. Our primary aim was to examine to what extent anger and disgust expressions invite interpersonal responses in terms of quarrelsomeness-agreeableness and dominance-submissiveness. Because social anxiety has previously been associated with a heightened sensitivity to anger and disgust expressions, as well as with alterations in quarrelsomeness-agreeableness and dominance-submissiveness, our secondary aim was to examine whether social anxiety moderates these responses.
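As a rough, hypothetical illustration of how such a moderation question is commonly specified (not the analysis actually used here), the sketch below regresses an interpersonal response score on expression type, a continuous social-anxiety score, and their product term; a reliable interaction coefficient is what would indicate that social anxiety moderates the response. All data and variable names are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical trial-level data: a quarrelsomeness rating for each viewed
# expression (anger vs. disgust), plus the rater's social-anxiety score.
rng = np.random.default_rng(2)
n = 300
expression = rng.choice(["anger", "disgust"], size=n)
social_anxiety = rng.normal(size=n)  # standardized questionnaire score (invented)
quarrelsomeness = (0.4 * (expression == "disgust")                      # invented main effect
                   + 0.2 * social_anxiety * (expression == "disgust")   # invented moderation
                   + rng.normal(scale=1.0, size=n))
df = pd.DataFrame({"expression": expression,
                   "social_anxiety": social_anxiety,
                   "quarrelsomeness": quarrelsomeness})

# Moderation test: the expression x social_anxiety interaction term.
fit = smf.ols("quarrelsomeness ~ expression * social_anxiety", data=df).fit()
print(fit.summary().tables[1])
```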


We addressed some limitations of previous studies using the same computer task [1, 17]: we only included participants whose mother tongue matched the language of the task, and we used face stimuli that were more recent than the previously used Pictures of Facial Affect series [43]. One remaining task-related limitation is that we assessed how individuals might behave in response to facial expressions presented as static images of unknown targets, rather than how individuals actually behave toward real-life others, who are often not strangers. However, this is also the case in approach-avoidance studies [2, 13, 29, 30]. Another task-related limitation is that we assessed responses to emotional expressions without verifying whether these expressions were recognized accurately. However, we also did this in our past studies, and we note that while anger tends to be misinterpreted as disgust and vice versa, there is no evidence that a lower resolution of the expression affects disgust recognition more than anger recognition [44]. Moreover, emotion recognition does not require the conscious processing of facial expressions [45] and is not required for appropriate social interaction.


AI is highly privacy disruptive because it underpins new surveillance capabilities in the physical world. Video-based surveillance and facial recognition are on a collision course with our sense of freedom. While tech companies market the story that surveillance is the only way to be safe, individuals will not buy this once they feel personally threatened by the technology. China's increasing use of the technology will give rise to a heightened sense of moral non-equivalency: freedom versus AI.


2020: The year we experience a backlash against neighborhood surveillance. Multiple cities will ban facial recognition in policing. Amazon will see its favored status as "most trusted tech company" ceded to Microsoft as a result of its Ring partnerships and hands-off approach to the uses of its facial recognition product, Rekognition.


2030: A patchwork of local laws and regulations around AI-based surveillance and facial recognition will finally result in a standard set of federal laws which protect individual rights and punish abusers of AI's technological capabilities.

