Technology and Society: Ethical Considerations and Challenges, Exams of Cybercrime, Cybersecurity and Data Privacy

This document explores the complex relationship between technology and society, examining various perspectives on their interaction. It delves into ethical considerations surrounding technology, including social responsibility, inequality, and the power of big tech. The document also discusses the importance of ethical design and development, highlighting the need for transparency and accountability in technological innovation. It further explores the impact of technology on human nature, the dangers of technological determinism, and the role of technocracy in shaping our technological future.

Typology: Exams

2024/2025

Available from 12/15/2024

Fortis-In-Re
Cybr 392
Study online at https://quizlet.com/_du2v5q
1. Three Approaches to viewing the relationship between technology and
society: Luddite: a person who is averse to technology or technological progress
Techno-optimist: someone who thinks technology will solve all the problems
Social Transformationalist: society will change through its interaction with technology.
o Changes take place at different scales of analysis: individuals, society, democracy, nation-states, and globally.
o New forms of exclusion and inclusion emerge.
o Social relations of power and inequality evolve.
2. Social Responsibility and ethics: Social responsibility and ethics are closely
intertwined concepts that guide individuals, organizations, and societies in making
decisions and taking actions that consider the broader impact on the well-being of
society and the environment
3. rational thinking: thinking that relies on careful reasoning and objective analysis
4. how inequality is fundamentally an ethical issue: Inequality is fundamentally
an ethical issue because it involves questions of fairness, justice, and the distribution
of resources and opportunities in society. It raises moral questions about the
treatment of individuals and groups, and it has profound implications for human
well-being, social cohesion, and the overall health of a society.
Human Dignity: Ethical discussions often emphasize the inherent dignity and worth
of every individual. Extreme inequality can undermine human dignity when some
individuals are denied basic necessities, opportunities, or rights simply because of
their socioeconomic status, race, gender, or other characteristics.
Rights and Equality: Many ethical and human rights frameworks emphasize the
principle of equality. Inequality can lead to the violation of basic human rights,
such as the right to education, healthcare, and a decent standard of living. Ethical
concerns arise when these rights are not equally accessible to all members of
society.
Ethical Responsibility: Individuals, organizations, and governments have ethical
responsibilities to address inequality. This includes policies and actions aimed at
reducing disparities and ensuring that everyone has a fair opportunity to lead a
decent life.
Global Inequality: In a globalized world, inequality is not limited to national boundaries.
Ethical discussions extend to the global level, where questions of justice,
fairness, and the responsibilities of affluent nations to address global poverty and
disparities come into play.

  1. Ethics washing: the idea that large companies deploying AI algorithms create ethical AI initiatives to enhance their reputation for PR reasons. Ex: oil companies claiming they're clean energy. Basically used to come off as ethical.
  2. Power of Big Tech in Section 230 of the 1996 US Communications Decency Act: provides certain legal protections to online platforms and has significant implications for the power of big tech companies. "26 words that created the internet," Section 230 has two key components: --No Liability for Third-Party Content: grants immunity to online platforms from liability for content posted by third-party users. In other words, if a user posts defamatory, offensive, or illegal content on a platform, the platform itself is not held legally responsible for that content. --Authority to Moderate Content: allows online platforms to moderate and remove content they consider objectionable or violative of their content guidelines without losing their immunity from legal liability.
  3. Why should we care about technology from an ethical standpoint: Technology is part of what makes us human Some of us create technology... --...but we all use technology! --We also have technology used on us, sometimes without our consent Technology is a key interface: --Between citizens and governments --Between companies and consumers --Between humans and the planet Technology shapes our lives and those of other people we come into contact with (directly or indirectly) --This can be liberating --This can also entrench asymmetries of power (inequalities)
  4. Is technology neutral?: "If it's neutral, it's not technology" Most humans will not get to decide how to use technology, most of us are on the receiving end of how others program technology to shape aspects of our lives


  • Design ethics • Ethics of algorithms • AI ethics, ethics of robotics, human-robot interaction • Ethics of drone technology • "Tech ethics is not simply a checklist that we can delegate to one person or team to do once, or even a few times a year, and then forget about. It is a mindset. It is a conversation that is interdisciplinary, dynamic and iterative - rather like technological innovation itself."
  1. Biometrics: Biometric technologies • Turn our bodies into data and code • Translate our body data from the physical world to the digital world • Examples include: • DNA, faces, voices, eyes (retinas, irises) • Fingerprints and finger geometry, footprints, handprints • Behavioral characteristics (emotions, personality traits, writing, walking) Key ethical danger: our biometrics cannot be reset like a username/password
  2. risk: The existential threats we face today (including those that could annihilate intelligent life on Earth) are due to the very same innovations that have made our lives easier, more enjoyable, and productive Ex: fossil fuels, nuclear war, infectious diseases, elimination of large categories of employment
  3. inequality: The benefits of technology remain unevenly distributed, and innovation may even widen some of the gaps Ex: life expectancy, infant mortality rates, patterns of resource use, Internet access
  4. The meaning and value of (human) nature: Technological innovation upsets continuity Technological developments threaten to destroy cherished landscapes and biological diversity Challenges to our concepts of a natural way of life Ex) gene modification, artificial intelligence, robotics, brain-computer interfaces
  5. technological determinism: the theory that technology, once invented, possesses an unstoppable momentum, reshaping society to fit its demands

Problem with this idea: causes us to see technology as autonomous, removed from social forces

  • Blinds us to the power and intent that create and design technology • Strips away responsibility from human creators • Technology is both a site and object of politics • Human values enter into the design of technology • Further downstream, human values continue to shape the ways in which technologies are put to use
  1. technocracy: the idea that technological inventions are managed and controlled by human actors...BUT only those with specialist knowledge and skills can rise to the task • Underlying idea: modern/technological life is too complicated to be managed by 'ordinary people' • Problem: experts can overestimate the degree of certainty behind their positions • Blind themselves to knowledge and criticism coming from outside of their closed ranks • Need for greater transparency and public oversight in tech's "expert" decision-making
  2. unintended consequences: implies that it is neither possible nor necessary to think ahead about things that can eventually go wrong • Technologies fail (we know this)...but who should be blamed for failures, and under what circumstances? • The more dramatic the failure, the less likely we are to accept that it was imagined, let alone intended, by the object/system's designers • Natural disaster? Act of God? • Key: this thinking absolves known actors of responsibility for what went wrong, and for picking up the pieces


  1. Weapons of Math Destruction: the heart of the booming big data economy Engineered to evaluate large numbers of people Are cheap Operate at a scale to sort, target or optimize millions of people Scale becomes the focus/justification, allowing imperfection to be ignored For some, a source of political currency, "fixing problems" For others, a source of profit Model is a black box - difficult to get clarity on decisions "Opaque, unquestioned, unaccountable"
  2. Weapons of Math Destruction: three important characteristics: Opacity: opaque and invisible (not clearly understood) - transparency matters! Common justification: "intellectual property" Damage: works against subjects' interests, can damage/destroy lives in unfair ways Pernicious feedback loop Ex: prison sentencing models (based on risk of "recidivism"): "sentencing models that profile a person by his or her circumstances help to create the environment that justifies their assumptions. The destructive loop goes round and round and the model becomes more and more unfair" (29) Scale: have the capacity to scale exponentially Turns WMDs into forces that define and delimit our lives Ex: credit scores...can affect your access to housing, jobs, transportation, etc.
  3. WMD feedback loop: Mathematical models or algorithms that claim to quantify important traits; algorithms producing unexplainable results that impact people's real lives
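The feedback loop described above can be sketched in a few lines of Python. All names and numbers here are invented for illustration; the sketch only shows the mechanism: a model allocates scrutiny based on past recorded incidents, scrutiny produces more recorded incidents wherever it is applied, and each round confirms the model's original assumption.

```python
# Hypothetical sketch of a pernicious feedback loop: two groups with
# identical underlying behavior but unequal initial records. The model
# targets whichever group has the most recorded incidents, and extra
# scrutiny surfaces more incidents there, widening the gap each round.

def feedback_loop(recorded, rounds=3, detection_boost=2):
    """recorded: dict mapping group -> incident count in the data."""
    history = [dict(recorded)]
    for _ in range(rounds):
        # The model flags the group with the most recorded incidents.
        target = max(recorded, key=recorded.get)
        # Scrutiny records more incidents only where it is applied.
        recorded[target] += detection_boost
        history.append(dict(recorded))
    return history

runs = feedback_loop({"group_a": 5, "group_b": 4})
print(runs[-1])  # {'group_a': 11, 'group_b': 4}: the initial gap widens
```

The one-incident starting difference is enough to concentrate all scrutiny, which is the "destructive loop" O'Neil describes.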
  4. Moneyball: Using statistics with predictive modeling to build winning teams and organizations
  5. Statistical Rigor: data sets are directly related to performance of players in the game Ex: strikes, balls, hits, home runs, etc.
  6. Role of Proxies in WMDs: proxies are assumed to correlate with success Ex: SAT scores, how much alumni give financially to their alma mater

When a model relies on proxies, it becomes easier to game the system Inflicts widespread damage Generates an endless spiral of destructive feedback loops
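The gaming problem can be made concrete with a small sketch. The candidates, scores, and quality numbers below are all invented; the point is only that when a ranking sorts on a proxy rather than the trait it stands in for, inflating the proxy beats improving the trait.

```python
# Illustrative sketch (not from the text): a ranking that relies on a
# proxy score rewards whoever games the proxy, not whoever has the
# underlying quality the proxy was meant to measure.

def rank_by_proxy(candidates):
    """candidates: list of (name, true_quality, proxy_score) tuples."""
    return sorted(candidates, key=lambda c: c[2], reverse=True)

applicants = [
    ("honest", 0.9, 60),   # strong underlying trait, un-gamed proxy
    ("gamed", 0.4, 95),    # weak trait, proxy coached upward
]
best = rank_by_proxy(applicants)[0][0]
print(best)  # "gamed" wins on the proxy despite lower true quality
```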

  1. Models - what are they, and how are they useful?: Model: an abstract representation of some process Ex: baseball game, oil company's supply chain, foreign government actions Models take what we know and use it to predict responses in various situations Tell us what to expect Help us to guide our decisions Building models extends power and influence in the world, creating something that others can implement Baseball models Healthy: rigorous, transparent and continuously updated People being modeled understand the model's objective (winning) Family meal model If kids questioned underlying assumptions, parents would be happy to explain (economic, dietary, etc.)
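The "abstract representation of some process" idea can be shown with the simplest possible baseball model. The numbers below are invented: a few observed at-bats are compressed into one statistic, which is then used to predict future outcomes.

```python
# A toy illustration of a model as an abstract representation: reduce
# observed at-bats to a batting average, then use that single number
# to predict hits in situations we have not seen. Data is invented.

hits, at_bats = 27, 90
batting_avg = hits / at_bats          # the "model": one number, 0.300

def expected_hits(future_at_bats):
    # prediction: what we know, projected onto a new situation
    return batting_avg * future_at_bats

print(round(expected_hits(200)))      # predicts 60 hits over 200 at-bats
```

This is also why the baseball case is "healthy" in O'Neil's terms: the inputs, the objective, and the update rule are all visible to the people being modeled.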
  2. Two vital ethical questions for WMDs: o Who designed the model? o What were they trying to accomplish?: Tend to punish the poor, who are processed by machines The wealthy benefit from personal input/evaluation Feedback of WMDs is often money, profit (not people) Money is also the incentive for creating/deploying algorithmic decision making "Systems are designed to gobble up more data and fine-tune their analytics so that more money will pour in. Investors, of course, feast on these returns and shower WMD companies with more money" People become collateral damage

  •The decision-making process is entirely opaque for applicants •We do not know what answers disqualify us from jobs •Model receives very little feedback •Scale: widespread in use, has enormous impact •Used to manage large numbers of applications (especially for low-paying jobs) •Save money by replacing HR professionals with machines •Feedback loop: red-lighting people with mental health issues prevents them from having a job and leading a normal life, further isolating them

  1. clopenings: • example of the wildly irregular schedules employees face What does this mean for eating, sleeping, child care, transportation, etc.?
  2. scheduling technologies are WMDs: •Used on a massive scale •Take advantage of people who are struggling to make ends meet •Opaque: workers have no clue when they'll be called to work •Poisonous feedback loop: haphazard schedules make it difficult for people to get out of low-wage jobs •Going to school •Organizing for better labor conditions •Physical health: lack of sleep and heightened anxiety •Second jobs are difficult to maintain •Children grow up without routines •Often limits workers to less than 30 hrs/week •Not eligible for company benefits, including health insurance
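A minimal sketch can show how a clopening falls out of demand-only scheduling. The shifts, staffing numbers, and worker names below are all invented; the point is that an optimizer which fills shifts purely from a demand forecast, with no rest constraint between shifts, will happily assign the same person a late close and an early open.

```python
# Hedged sketch of demand-driven shift assignment: staff each shift to
# the forecast alone, ignoring the hours between one worker's shifts.
# All shift times, staffing levels, and names are hypothetical.

forecast = [("Mon close (until 02:00)", 2), ("Tue open (from 06:00)", 2)]
workers = ["ana", "ben", "carla"]

schedule = {}
for shift, staff_needed in forecast:
    # naive assignment: always take the first N workers, demand only
    schedule[shift] = workers[:staff_needed]

clopening = set(schedule["Mon close (until 02:00)"]) & set(
    schedule["Tue open (from 06:00)"])
print(sorted(clopening))  # ['ana', 'ben']: four hours between shifts
```

Real workforce-management systems are far more elaborate, but the text's complaint is exactly this: the objective function optimizes labor cost against demand, and the worker's sleep, child care, and second job never enter it.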
  3. Roots in Operations Research: •Scheduling technology: roots in Operations Research •Grew in importance during WW-II •Post WW-II spread into workforce: science of logistics •"Just in Time" production - revolutionizes supply chain management •Japan: Toyota and Honda •Underpinnings of companies like Amazon and UPS
  4. Predictive Policing: crime prediction software, relying on big data PredPol: focuses on geography instead of individuals Very popular in budget-strapped police departments across the US Police have a choice when setting up the PredPol system: what to focus on? Part 1 crimes: violent crimes like homicide, arson and assault Part 2 crimes: petty crimes like vagrancy, panhandling, selling small amounts of drugs - "nuisance" crimes that would often go unnoticed if an officer was not there to witness them Endemic to many impoverished neighborhoods

Choice to include Part 2 crimes: encouraged by the idea of "broken windows policing" -- the idea that low-level crimes create an "atmosphere of disorder" in a neighborhood
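The configuration choice described above can be sketched directly. The neighborhoods and counts are invented; the sketch shows only that whether Part 2 nuisance crimes are fed into a geographic model changes which area the model flags for patrols.

```python
# Hypothetical sketch of the Part 1 / Part 2 design choice in a
# geographic crime-prediction setup. All place names and counts are
# invented for illustration.

part1 = {"uptown": 8, "downtown": 6}      # violent-crime reports
part2 = {"uptown": 2, "downtown": 40}     # nuisance-crime reports

def hotspot(counts):
    """Flag the area with the most recorded incidents."""
    return max(counts, key=counts.get)

print(hotspot(part1))   # "uptown": Part 1 data alone

combined = {area: part1[area] + part2[area] for area in part1}
print(hotspot(combined))  # "downtown": Part 2 data now dominates
```

Since nuisance crimes are mostly recorded where officers already are, including them steers patrols toward already heavily policed neighborhoods, which then generate still more Part 2 records, the same feedback loop as above.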

  1. How does the digitalization of criminal justice reproduce unjust social conditions?: - Digital technologies and their related sciences have helped extend the Wars on Crime (1960s on) and Drugs (1970s on) Database systems, mobile devices and wireless networks: allowed correctional supervision, detainment, and racial profiling to quietly expand Authorities appeal to data science to justify the continuation of crime and drug war policies "This future is actually old" IT corporations have become primary drivers of criminal justice digitization Information capital: has turned criminal justice technoscience into a thriving industry The convergence of mass criminalization and smart cities is transforming geographies of carceral power Smart city infrastructure: cell towers, environmental sensors, fiber-optic cables, local area networks, server rooms, smart cameras, etc. Allows cities to administer entire communities in ways that increasingly resemble correctional supervision. To understand racialized governance, we need to confront its socio-technical dimension The digitalization of criminal justice reproduces unjust social conditions Fences and bars are made invisible Racialized surveillance and punishment find their ways onto our personal devices
  2. The State: The state plays a key role in classifying populations and articulating relations between populations Key function of modern nation-state: assigning racial classification to groups These classifications have long been used by courts, law enforcement, and military personnel to justify race-based exclusions Ex: enslaving Africans, Japanese internment, immigration criteria, the Muslim ban The state is best described as a network of apparatuses

Generated through spatial statistics/applications Positioned police as "permanent administrators of impoverished areas" 1500 officers for 24 areas, looking for minor offenses (illegal window washing, public urination, homeless encampments, loitering, panhandling, public alcohol consumption) Proudly credited by the mayor for generating 50,000 arrests over 2 years Continuous tracking/surveillance of targeted individuals Drug possession, property crime, subway fare evasion, prostitution, trespassing Result: low-income communities become inundated with police and aggressively micromanaged through hyperactive tactics Ex) geotargeting people for stop-and-frisk (over 500,000 incidents in impact zones over 2005-2006, over 80% Black or Hispanic)

  1. Meme Wars, Introduction: o Memorable pieces of media that resonate with people for different reasons o Turn people into characters (no longer real persons) o Recontextualized, Remixed, Redistributed o Memes signify membership within an in-group § Ex) Gadsden Flag
  2. Meme Wars, with regards to how culture wars have been impacted by the internet: how religious conflicts in America's diverse population evolved into the polarized traditional-versus-progressive political dichotomy Evangelicals and radio personalities (Rush Limbaugh): spiritual/existential enemy was liberalism Andrew Breitbart: "politics is downstream of culture" Idea: if you can shape the culture, you can shape politics "Social media did to the culture wars what spinach did to Popeye - it juiced them up"
  3. From Social Networking to Social Media: Social networking: business model to connect people and litter pages with ads Social media: business model connects people to other people and ad-laden "content" Change: a digital economy built on engagement

Content farms and clickbait mimic the tone and style of news sites, but their real intention is to make money off of advertising Clickbait ushers in a new era of "fake news" and disinformation Personalized information ecosystems: algorithmic echo chambers shaped individual news feeds and timelines Result: Polarization! Ex) Washington Post's "Red Feed and Blue Feed"

  1. rabbit holes: metaphor for a winding pathway to an unusual or unsettling environment (Alice in Wonderland) Applied here: memes can entice us down rabbit holes, following them from website to website, particularly ones that are disturbing or taboo Rabbit holes are built into the design of social media, which gives incredible power to harness the Four "R's" of media manipulation...
  2. Responsiveness (reactions) and Reinforcement (algorithmic): Responsive- ness: reactions that posts evoke Ex) comments, hearts, retweets....ways of signaling excitement Social media: the only mass media form with constant interaction on a range of topics Responding brings communities together in replies Reinforcement: conferred on a topic by algorithms, which personalize experiences for users and promote similar types of content in recommendation and search returns Turf wars: creates objective to game the algorithm to surface your ideas (ex: what comes up when you search "QAnon," "Trump," "Joe Biden"?)
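Engagement-based reinforcement can be sketched in a few lines. This is not any platform's actual ranking code; the posts and reaction counts are invented. It shows only the core incentive: a feed sorted by reactions promotes whatever already gets reactions, with no term for accuracy or quality.

```python
# Minimal, hypothetical sketch of engagement ranking: placement in the
# feed is decided entirely by reaction counts, so content that provokes
# responses is reinforced regardless of its merit. Data is invented.

posts = [
    {"id": "fringe-meme", "reactions": 950},
    {"id": "careful-report", "reactions": 120},
]

def rank_feed(posts):
    """Order the feed by engagement alone, highest first."""
    return sorted(posts, key=lambda p: p["reactions"], reverse=True)

top = rank_feed(posts)[0]["id"]
print(top)  # "fringe-meme": engagement, not quality, decides placement
```

This is the objective that "turf wars" try to game: generate reactions, and the algorithm does the rest of the amplification for you.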
  3. From the wire to the weeds: this design makes social media the perfect mechanism for moving fringe ideas into the mainstream over time From the Wire to the Weeds: -Someone makes an appeal online --This leads to a real-world event where violence or a spectacle breaks out ---This event leads to media coverage and action online ----Leads to a new real-world event, causing violence or spectacle -----Leads to media attention ------Leads to online discussion or planning .....

purpose and material/social relations of the thing itself Obscures systemic relations of power and closes off informed public discussion of whom AI systems serve and their wider planetary consequences Promotes the fantasy that "AI systems are disembodied brains that absorb and produce knowledge independently from their creators, infrastructures, and the world at large"

  1. Four ways that AI is an extractive industry: Mining: tech sector's vast appetite for rare earth materials Energy: fossil fuels, access to land/resources for data centers Software: building models for natural language processing and computer vision consumes enormous amounts of energy Infrastructure: public utilities (gas mains, sewer pipes, high-voltage lines and discount electricity) The artificial intelligence industry has been heavily subsidized: defense funding, federal research agencies, tax breaks, public utilities, data and unpaid labor taken from all who use search engines and post images online Human labor: a global workforce, often paid pennies on the dollar, performing microtasks so that data systems can seem more intelligent than they are From miners extracting tin in Indonesia, to crowdworkers in India, to iPhone factory workers in China Tech companies: a shadow workforce of contract laborers outnumbers full-time employees, but with fewer benefits and no job security AI systems increase surveillance and control in workplaces Ex) Amazon warehouses: apps used to track workers, nudge them to work longer hours, rank them in real time Data: all publicly accessible digital material is open to being harvested for training datasets that produce AI models People's personal materials become infrastructure: used to improve algorithms that perform functions like facial recognition, language prediction and object detection Ethical issues of privacy and "surveillance capitalism" Data is used in practices of classification - AI systems use labels to predict human identity and qualities Essentialized gender and race categories

Problematic assessments of character and creditworthiness Reliance on proxies, datasets as political interventions Machine learning becomes a powerful governing force

  1. AI as both embodied and material: AI as we know it depends entirely on a much wider set of political and social structures: Natural resources Human labor and human data Human histories and classifications Infrastructures and logistics Capital
  2. AI ethics: Rarely enforceable: self-regulation allows companies to choose how to deploy technologies Companies rarely suffer financial penalties when AI systems violate laws and/or their own ethical principles Focus on ethical ends rather than means of AI's application Rarely accountable to a broader public Primarily accountable to shareholders: maximize ROI over ethical concerns
  3. two problems with current AI ethics principles: "many social institutions are now influenced by these tools and methods, which shape what they value and how decisions are made while creating a complex series of downstream effects." (20) Key question: who decides what ethical AI means for the rest of the world? ->brings us to questions of power and inequality: Whose interests does it serve? Who bears the greatest risk of harm? How can I serve the commitment to a more just and sustainable world? Where do these technologies serve that vision? Where should AI not be used? Where does it undermine justice?
  4. What is Ghost Work in the tech sector today?: an opaque world of employment, often intentionally hidden ("a shadow workforce") Humans behind the seemingly automated systems that we all take for granted Human labor that powers many of our mobile phone apps, websites, and AI systems "when the AI trips up or can't finish the job, thousands of businesses call on people to quietly complete the project"

Human bosses replaced by automated processes programmed to oversee an anonymous, far-flung workforce

  1. AI Ethics and the future of labor: Who benefits from the veneer of automation, and who might be harmed? Algorithmic cruelty: the thoughtless processing of human labor through computation that is incapable of thought, let alone empathy "Just as we need companies to be accountable for the labor practices that produce our food, clothes, and computers, so should the producers of digital content be accountable to their consumers and workers" (xxxi) A call for transparency: "we should demand truth in advertising in cases where humans have been brought in to benefit us"
  2. Ethical responses to WMDs, from O'Neil: o Impose human ethics on these systems, even at the cost of efficiency o Enforce and update laws that already exist Key: we need to impose human ethics on these systems, even at the cost of efficiency • Ex) a model might be programmed to make sure that various ethnicities or income levels are represented within groups of voters or consumers
  • Ex) a model could highlight cases in which people in certain zip codes pay twice the average for certain services
  • Mathematical models should be our tools, not our masters. Need to enforce and update laws that already exist • Fair Credit Reporting Act (FCRA) and the Equal Credit Opportunity Act (ECOA): both were meant to ensure fairness in credit scoring
  • Americans with Disabilities Act (ADA) • Currently prohibits medical exams as part of an employment screening. But needs to account for Big Data personality tests, health scores, and reputation scores
  • Health Insurance Portability and Accountability Act (HIPAA): expand to cover the medical data collected by employers, health apps, and other Big Data companies
  • Ex) Google searches for medical treatments
  1. Transparency: o Install a regulatory system that accounts for hidden costs of WMDs § Non-numerical values - Ex) predatory lenders, targeting people looking for government assistance


  1. Getting a grip on techno-utopia: that unbounded and unwarranted hope in what algorithms and technology can accomplish Where should algorithmic decision-making not be used? "Before asking them to do better, we have to admit they can't do everything."
  2. Auditing WMDs: o To disarm WMDs, we also need to measure their impact and conduct algorithmic audits
  • Ex) Value-added models assessing teacher effectiveness • Movements towards auditing algorithms are underway at leading universities Ex) software robots masquerade as people of all stripes online - a way to see the treatment they receive and detect biases in automated systems
  • To police the WMDs, from search engines to job placement sites, we need people with the skills to build them
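The paired-testing idea behind these audits can be sketched briefly. Everything below is invented: the scoring function is a stand-in for some opaque model, not any real system, and the fields and threshold are assumptions. The audit probes the black box with matched inputs that differ only in one attribute and counts how often the decision flips.

```python
# Hypothetical sketch of an algorithmic audit by paired testing:
# feed a black-box model two otherwise-identical applicants and
# measure how often the outcome differs. The model here is a toy
# stand-in with a deliberate proxy bias baked in.

def score(applicant):
    """Stand-in for an opaque scoring model (invented)."""
    s = applicant["income"] / 1000
    if applicant["zip"] == "low-income":   # a zip-code proxy slips in
        s -= 20
    return s

def audit(model, pairs, threshold=30):
    """Fraction of matched pairs where the approval decision flips."""
    flips = 0
    for a, b in pairs:
        if (model(a) >= threshold) != (model(b) >= threshold):
            flips += 1
    return flips / len(pairs)

pairs = [({"income": 45_000, "zip": "affluent"},
          {"income": 45_000, "zip": "low-income"})]
print(audit(score, pairs))  # 1.0: identical applicants, different outcomes
```

This is the logic of the "software robots masquerading as people" described above: the auditor never needs to see inside the model, only to observe how it treats matched probes.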
  1. Crowdsourcing: allows people across society to report on the messaging they are receiving
  • illuminate the practices and strategies of microtargeting campaigns
  1. Hippocratic Oath: Four key principles of medical ethics:
  2. Autonomy: respect individuals' right to make their own choices and have their own life plan
  • Informed consent
  1. Non-maleficence: do not inflict harm intentionally
  2. Beneficence: act for the benefit of others
  3. Justice: distribute health resources fairly O'Neil: "like doctors, data scientists should pledge a Hippocratic Oath, one that focuses on the possible misuses and misinterpretations of their models." (174) Hare: "inviting ourselves to think about the values we want to guide the creation and use of tools and technology is one step that we can all take to create a culture of technology ethics" (205)
  4. Key ethics-related questions from Hare: "inviting ourselves to think about the values we want to guide the creation and use of tools and technology is one step that we can all take to create a culture of technology ethics" Technology ethics is increasingly on the curriculum at leading universities...Harvard, Stanford, Cornell, MIT, IU, UT-Austin, NYU, Wisconsin, U of Washington,