Kate Crawford Quotes

Powerful Kate Crawford Quotes for Daily Growth

About Kate Crawford

Kate Crawford is an influential Australian scholar, writer, and artist whose research examines the social and political implications of artificial intelligence (AI). She completed her PhD at the University of Sydney and has built a career spanning academia, industry research, and art, with work centered on labor, data, power, and the material costs of AI systems. In 2017 she co-founded the AI Now Institute at New York University with Meredith Whittaker, one of the first university research institutes dedicated to studying the social implications of AI. She is a senior principal researcher at Microsoft Research and a research professor at the University of Southern California's Annenberg School. Her book "Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence" (Yale University Press, 2021) traces the labor, data, and environmental resources that AI systems consume. With the artist Trevor Paglen she co-authored the essay "Excavating AI," a study of the politics of image training sets, and with Vladan Joler she created "Anatomy of an AI System," an award-winning visual analysis of the Amazon Echo's supply chain. Her work continues to push boundaries in understanding the intersection of technology, culture, and society, and she is widely regarded as one of the leading voices in AI research and ethics.

Interpretations of Popular Quotes

"Data is not neutral; it carries with it the biases and power dynamics of the society in which it was created."

This quote suggests that data, which forms the foundation of modern technology and AI, is not free from human influence or bias. The information collected reflects societal norms, beliefs, and power structures, often perpetuating inequalities and reinforcing stereotypes. It emphasizes the importance of acknowledging these underlying factors when creating and using data, to ensure fairness, transparency, and ethical practices in technology development and decision-making.
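The mechanism the interpretation describes can be made concrete with a toy sketch. The data below is entirely hypothetical: a record of past hiring decisions that favored one group, and a naive frequency model "trained" on it. The point is simply that a model fit to biased records reproduces the bias rather than correcting it.

```python
# Toy illustration (hypothetical data): a model trained on historically
# biased decisions carries that bias forward.
from collections import defaultdict

# Hypothetical historical hiring records: (group, hired?)
history = [("A", True)] * 80 + [("A", False)] * 20 \
        + [("B", True)] * 30 + [("B", False)] * 70

# "Train" a naive frequency model: estimate P(hired | group).
counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
for group, hired in history:
    counts[group][0] += int(hired)
    counts[group][1] += 1

rates = {g: hired / total for g, (hired, total) in counts.items()}
print(rates)  # {'A': 0.8, 'B': 0.3}

# A decision rule built on these rates simply encodes the old disparity:
def predict(group, threshold=0.5):
    return rates[group] >= threshold

print(predict("A"), predict("B"))  # True False
```

Nothing in the pipeline is malicious; the skew is inherited entirely from the records the model was given, which is the quote's point about data carrying the power dynamics of its source.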


"Artificial intelligence systems are not just tools that we can pick up and wield as we please. They are products of complex social, political, and economic processes that require our careful attention and scrutiny."

This quote emphasizes the inherent complexity of Artificial Intelligence (AI) systems. It suggests that AI is not merely a neutral tool to be used at will, but rather a product shaped by intricate social, political, and economic factors. The author encourages us to critically examine these underlying processes when engaging with AI, as they play a significant role in shaping its development, use, and impact on society. This perspective calls for a thoughtful approach to the design, implementation, and application of AI technologies to ensure their benefits are equitably distributed and potential risks are mitigated.


"Machine learning has a powerful effect on the way knowledge is produced and distributed in society."

This quote suggests that machine learning significantly influences how knowledge is created and disseminated within our society. By automating processes traditionally performed by humans, machine learning alters the dynamics of knowledge production, making it faster, more efficient, and often more accessible. However, this can also lead to power imbalances as those with access to these technologies may control or shape the knowledge that becomes widely available. Additionally, the lack of transparency in many machine learning models can obscure how knowledge is produced, potentially skewing public understanding. Overall, it emphasizes the need for careful examination and ethical consideration when implementing machine learning systems to ensure equitable knowledge distribution and avoid perpetuating existing biases or creating new ones.
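The power-imbalance point above has a simple feedback-loop form. The sketch below uses invented numbers: a ranker that orders results purely by past popularity sends new attention to whatever is already visible, so visibility compounds while everything else stays hidden.

```python
# Toy sketch (hypothetical numbers): popularity-based ranking feeds
# visibility back on itself, concentrating attention on the top result.
docs = {"mainstream": 1000, "niche": 10}  # prior click counts

def rank(docs):
    # Rank purely by past popularity -- the already-visible stays on top.
    return sorted(docs, key=docs.get, reverse=True)

for _ in range(3):
    top = rank(docs)[0]
    docs[top] += 100  # the top slot attracts most of the new clicks

print(docs)  # {'mainstream': 1300, 'niche': 10}
```

The gap widens on every iteration even though the ranking rule is formally "neutral," which is one concrete way an opaque system can shape which knowledge circulates.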


"When we design AI systems, we need to consider not only their technical capabilities but also their broader social and cultural implications."

This quote by Kate Crawford emphasizes the importance of approaching Artificial Intelligence (AI) development holistically. It highlights that while AI's technological aspects are crucial, they should never be the sole focus. Instead, we must consider its wider social and cultural impacts as well. In other words, it's not enough to make an AI system that performs tasks efficiently; we must also ensure that it aligns with our societal values, respects human rights, and does not exacerbate existing inequalities or create new ones. By acknowledging and addressing these implications, we can design AI systems that are not only useful but also ethical and beneficial to all members of society.


"We must ask: who benefits from these technologies? Who is excluded? And what values do they reinforce or challenge?"

This quote emphasizes the importance of considering the social, ethical, and economic impacts of technology development. By questioning "who benefits" and "who is excluded," Crawford encourages us to prioritize inclusivity and equity in technological advancements. Moreover, she highlights that technologies reflect and reinforce certain societal values, suggesting that we should also evaluate how they challenge or uphold those values, and strive for positive change if necessary.


There's been the emergence of a philosophy that big data is all you need. We would suggest that, actually, numbers don't speak for themselves.

- Kate Crawford

Books about technology start-ups have a pattern. First, there's the grand vision of the founders, then the heroic journey of producing new worlds from all-night coding and caffeine abuse, and finally, the grand finale: immense wealth and secular sainthood. Let's call it the Jobs Narrative.

- Kate Crawford

People think 'big data' avoids the problem of discrimination because you are dealing with big data sets, but, in fact, big data is being used for more and more precise forms of discrimination - a form of data redlining.

- Kate Crawford

If you're not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it's very likely that, if you're designing a system based on historical data, you're going to be perpetuating those biases.

- Kate Crawford

If we start to use social media data sets to take the pulse of a nation or understand a crisis - or actually use it to deploy resources - we are getting a skewed picture of what is happening.

- Kate Crawford

Only by developing a deeper understanding of AI systems as they act in the world can we ensure that this new infrastructure never turns toxic.

- Kate Crawford

As AI becomes the new infrastructure, flowing invisibly through our daily lives like the water in our faucets, we must understand its short- and long-term effects and know that it is safe for all to use.

- Kate Crawford

The amount of money and industrial energy that has been put into accelerating AI code has meant that there hasn't been as much energy put into thinking about social, economic, ethical frameworks for these systems. We think there's a very urgent need for this to happen faster.

- Kate Crawford

Hidden biases in both the collection and analysis stages present considerable risks and are as important to the big-data equation as the numbers themselves.

- Kate Crawford

Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.

- Kate Crawford

Surveillant anxiety is always a conjoined twin: The anxiety of those surveilled is deeply connected to the anxiety of the surveillers. But the anxiety of the surveillers is generally hard to see; it's hidden in classified documents and delivered in highly coded languages in front of Senate committees.

- Kate Crawford

The promoters of big data would like us to believe that behind the lines of code and vast databases lie objective and universal insights into patterns of human behavior, be it consumer spending, criminal or terrorist acts, healthy habits, or employee productivity. But many big-data evangelists avoid taking a hard look at the weaknesses.

- Kate Crawford

Data will always bear the marks of its history. That is human history held in those data sets.

- Kate Crawford

Sexism, racism, and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape how we are categorized and advertised to.

- Kate Crawford

Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don't opt in, they aren't forced to participate.

- Kate Crawford

Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems.

- Kate Crawford

Error-prone or biased artificial-intelligence systems have the potential to taint our social ecosystem in ways that are initially hard to detect, harmful in the long term, and expensive - or even impossible - to reverse.

- Kate Crawford

Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations.

- Kate Crawford

It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science.

- Kate Crawford

Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn't see it that way. To them, even the act of walking down the street is a legitimate data set to be captured, catalogued, and exploited.

- Kate Crawford

When dealing with data, scientists have often struggled to account for the risks and harms using it might inflict. One primary concern has been privacy - the disclosure of sensitive data about individuals, either directly to the public or indirectly from anonymised data sets through computational processes of re-identification.

- Kate Crawford

Big Data is neither color-blind nor gender-blind. We can see how it is used in marketing to segment people.

- Kate Crawford

While massive datasets may feel very abstract, they are intricately linked to physical place and human culture. And places, like people, have their own individual character and grain.

- Kate Crawford

Like all technologies before it, artificial intelligence will reflect the values of its creators. So inclusivity matters - from who designs it to who sits on the company boards and which ethical perspectives are included.

- Kate Crawford

Self-tracking using a wearable device can be fascinating.

- Kate Crawford

We need to be vigilant about how we design and train these machine-learning systems, or we will see ingrained forms of bias built into the artificial intelligence of the future.

- Kate Crawford

If you have rooms that are very homogeneous, that have all had the same life experiences and educational backgrounds, and they're all relatively wealthy, their perspective on the world is going to mirror what they already know. That can be dangerous when we're making systems that will affect so many diverse populations.

- Kate Crawford

As we move into an era in which personal devices are seen as proxies for public needs, we run the risk that already-existing inequities will be further entrenched. Thus, with every big data set, we need to ask which people are excluded. Which places are less visible? What happens if you live in the shadow of big data sets?

- Kate Crawford

The fear isn't that big data discriminates. We already know that it does. It's that you don't know if you've been discriminated against.

- Kate Crawford

We urgently need more due process with the algorithmic systems influencing our lives. If you are given a score that jeopardizes your ability to get a job, housing, or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.

- Kate Crawford
