Embedding Ethics in Computer Science

This site provides access to curricular materials created by the Embedded Ethics team at Stanford for undergraduate computer science courses. The materials are designed to expose students to ethical issues that are relevant to the technical content of CS courses, to provide students with structured opportunities to engage in ethical reflection through lectures, problem sets, and assignments, and to build the ethical character and habits of young computer scientists.


Modules (Assignments + Lectures)

Banking on Security

Course: Intro to Systems

This assignment is about assembly, reverse engineering, security, privacy, and trust. An earlier version of the assignment by Randal Bryant & David O'Hallaron (CMU), [accessible here](http://csapp.cs.cmu.edu/public/labs.html), used the framing story that students were defusing a ‘bomb’.

Bits, Bytes, and Overflows

The assignment is the first in an introduction to systems course. It covers bits, bytes, and overflow, continuing students’ introduction to bitwise and arithmetic operations. Following the saturating arithmetic problem, we added a case study analysis about the Ariane 5 rocket launch failure. This provided students with a vivid illustration of the potential consequences of overflows as well as an opportunity to reflect on their responsibilities as engineers. The starter code is the full project provided to students.
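The contrast at the heart of the problem can be sketched in a few lines. This is not the assignment's starter code; it is a minimal illustration, simulating 16-bit signed arithmetic in Python by masking, that shows how wrap-around silently corrupts a result while saturation clamps it:

```python
# Illustrative only: simulate 16-bit signed integers (the Ariane 5 failure
# involved an overflowing conversion to a 16-bit value).

INT16_MIN, INT16_MAX = -2**15, 2**15 - 1

def wrap16(x):
    """Interpret x modulo 2**16 as a signed 16-bit integer (wrap-around)."""
    x &= 0xFFFF
    return x - 0x10000 if x > INT16_MAX else x

def sat_add16(a, b):
    """Add two values, clamping the sum to the representable 16-bit range."""
    s = a + b
    return max(INT16_MIN, min(INT16_MAX, s))

print(wrap16(30000 + 10000))    # -25536: the overflow wraps negative
print(sat_add16(30000, 10000))  # 32767: the sum saturates at INT16_MAX
```

A wrapped result is not just wrong but wrong in a misleading way (a large positive quantity becomes negative), which is what makes silent overflow so dangerous in control software.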

Climate Change & Calculating Risk

Course: Probability

This assignment uses the tools of probability theory to introduce students to _risk weighted expected utility_ models of decision making. The risk weighted expected utility framework is then used to understand decision-making under uncertainty in the context of climate change. Which of the IPCC’s forecasts should we use? Do we owe it to future people to adopt a conservative risk profile when making decisions on their behalf? The assignment also introduces normative principles for allocating responsibility for addressing climate change. Students apply these formal tools and frameworks to understanding the ethical dimensions of climate change.
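One common formalization (following Buchak-style risk-weighted expected utility; the assignment's own formulation may differ) weights each increment of utility by a risk function applied to the probability of doing at least that well. A small sketch, with an invented two-outcome gamble:

```python
# Risk-weighted expected utility sketch. outcomes: list of (probability,
# utility) pairs; r: a risk function on [0, 1]. With r(p) = p this reduces
# to ordinary expected utility; a convex r (e.g. p**2) models risk aversion.

def reu(outcomes, r):
    outs = sorted(outcomes, key=lambda pu: pu[1])  # worst utility first
    total = outs[0][1]                             # guaranteed baseline
    tail = 1.0
    for i in range(1, len(outs)):
        tail -= outs[i - 1][0]                     # P(doing at least this well)
        total += r(tail) * (outs[i][1] - outs[i - 1][1])
    return total

gamble = [(0.5, 0.0), (0.5, 100.0)]                # hypothetical 50/50 gamble
print(reu(gamble, lambda p: p))                    # 50.0: expected utility
print(reu(gamble, lambda p: p ** 2))               # 25.0: risk-averse value
```

A conservative risk profile for decisions made on behalf of future people corresponds to a more sharply convex r, which discounts the optimistic tail of the forecast distribution.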

Fairness, Representation, and Machine Learning

This assignment builds on introductory knowledge of machine learning techniques, namely the naïve Bayes algorithm and logistic regression, to introduce concepts and definitions of algorithmic fairness. Students analyze sources of bias in algorithmic systems, then learn formal definitions of algorithmic fairness such as independence, separation, and fairness through awareness or unawareness. They are also introduced to notions of fairness that complicate the formal paradigms, including intersectionality and subgroup analysis, representation, and justice beyond distribution.
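Two of these criteria can be checked with a few lines of code. The groups, labels, and predictions below are invented for illustration, not drawn from the assignment:

```python
# Independence (demographic parity): positive-prediction rates should match
# across groups. Separation (here, equality of true-positive rates): accuracy
# on the truly positive examples should match across groups.

def rate(pred, cond):
    sel = [p for p, c in zip(pred, cond) if c]
    return sum(sel) / len(sel)

group = ["a", "a", "a", "a", "b", "b", "b", "b"]  # protected attribute
y     = [1, 1, 0, 0, 1, 1, 0, 0]                  # true labels
yhat  = [1, 0, 0, 0, 1, 1, 1, 0]                  # classifier output

for g in ("a", "b"):   # independence check: 0.25 vs 0.75, violated
    print(g, rate(yhat, [gi == g for gi in group]))

for g in ("a", "b"):   # separation check: TPR 0.5 vs 1.0, violated
    print(g, rate(yhat, [gi == g and yi == 1 for gi, yi in zip(group, y)]))
```

The later parts of the assignment complicate exactly this picture: a classifier can satisfy one formal criterion per group while failing for intersectional subgroups that the group labels do not capture.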

Lab: Therac-25 Case Study

This lab, the last of the course, asks students to discuss the case of Therac-25, a medical device that delivered lethal radiation overdoses due in part to a race condition.
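The hazard behind such a race can be shown deterministically. The sketch below is a simplification invented for illustration (the actual Therac-25 failures involved several interacting software and hardware factors): a check-then-act sequence in which shared state changes between the check and the act, written out as an explicit interleaving rather than with real threads:

```python
# Check-then-act hazard: a task reads a shared flag, another task invalidates
# it, and the first task then acts on its stale check.

log = []
setup_done = True        # shared flag: hardware configured for current mode
mode = "electron"        # shared state: current beam mode

check = setup_done       # beam task reads the flag...

# ...but before it acts, an operator edit changes the mode and invalidates
# the setup.
setup_done, mode = False, "xray"

if check:                # stale decision: the guard passed earlier
    log.append(f"fired in {mode} mode before setup completed")

print(log)               # the unsafe action happened despite the guard
```

With real concurrency this interleaving occurs only sometimes, which is why such bugs survive testing; making the check and the act atomic (e.g. under a lock) removes the window.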

Responsible Disclosure & Partiality

This assignment is about `void *` and generics. We added a case study about responsible disclosure and partiality. Students read a summary of researcher Dan Kaminsky’s discovery of a DNS vulnerability and answer questions about his decisions regarding disclosure of vulnerabilities as well as their own thoughts on partiality. The starter code is the full project provided to students.

Responsible Documentation

When functions have assumptions, limitations, or flaws, it is vital that the documentation makes those clear. Without documentation, developers don’t have the information they need to make good decisions when writing their programs. We added a documentation component to this C string assignment. Students write a manual page for the `scan_token` function they have implemented, learning responsible documentation practice as they go. The starter code is the full project provided to students.
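As a sketch of the practice (in Python rather than the assignment's C, with a toy tokenizer invented here purely for illustration), responsible documentation states the function's assumptions and limitations right next to its contract:

```python
def scan_token(text, delimiters=" \t\n"):
    """Return (token, rest): the first delimiter-separated token in text
    and the remaining unscanned text.

    Assumptions:
        - text is a str; bytes input is not supported.
        - delimiters is a string of single-character separators.

    Limitations:
        - Returns ("", "") when text is empty or contains only delimiters.
        - Does not handle quoting or escaped delimiters.
    """
    i = 0
    while i < len(text) and text[i] in delimiters:      # skip leading delimiters
        i += 1
    j = i
    while j < len(text) and text[j] not in delimiters:  # consume the token
        j += 1
    return text[i:j], text[j:]

print(scan_token("  hello world"))  # ('hello', ' world')
```

The point of the exercise is the docstring, not the code: a caller who never reads the implementation still learns what inputs are out of scope and what edge cases to expect.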

Design Discovery and Needfinding

Course: Intro to HCI

The lecture covers topics associated with power relations, the use of language, standpoint, and inclusion as they arise in the context of design discovery.

Values in Design

The lecture presents the concept of values in design. It introduces the distinction between intended and collateral values; discusses the importance of assumptions in the value encoding process; and presents three strategies to address value conflicts that arise as a result of design decisions.

Assignments

Concept Video

This assignment asks students to consider what values are encoded in their product and the decisions they make in the design process; whether there are conflicting values; and how they address existing value conflicts.

Ethics in Advanced Technology

Course: AI Principles

After successfully creating a component of a self-driving car – a (virtual) sensor system that tracks other surrounding cars based on noisy sensor readings – students are prompted to reflect on ethical issues related to the creation, deployment, and policy governance of advanced technologies like self-driving cars. Students encounter classic concerns in the ethics of technology such as surveillance, ethics dumping, and dual-use technologies, and apply these concepts to the case of self-driving cars.

Foundations: Code of Ethics

In Problem 3 of this assignment, “Ethical Issue Spotting,” students explore the ethics of four different real-world scenarios using the ethics guidelines produced by a machine learning research venue, the NeurIPS conference. Students write a potential negative social impacts statement for each scenario, determining if the algorithm violates one of the sixteen guidelines listed in the NeurIPS Ethical Guidelines. In doing so, they practice spotting potential ethical concerns in real-world applications of AI and begin taking on the role of a responsible AI practitioner.

Heuristic Evaluation

This assignment asks students to evaluate their peers’ projects through a series of heuristics and to respond to others’ evaluations of their projects. By incorporating ethics questions into this evaluation, we prompt them to consider ethical aspects as part of a product’s design features which should be evaluated alongside other design aspects.

Medium-Fi Prototype

Modeling Sea Level Rise

This assignment is about Markov Decision Processes (MDPs). In Problem 5, we use the MDP the students have created to model how a coastal city government’s mitigation choices will affect its ability to adapt to rising sea levels over the course of multiple decades. At each timestep, the government may choose to invest in infrastructure or save its surplus budget. But the amount that the sea will rise is uncertain: each choice is a risk. Students model the city’s decision-making under two different time horizons, 40 or 100 years, and with different discount factors for the well-being of future people. In both cases, they see that choosing a longer time horizon or a smaller discount factor will lead to more investment now. Students are then introduced to five ethical positions on the comparative value of current and future generations’ well-being. They evaluate their modeling choices in light of their choice of ethical position.
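The effect of discounting can be seen in a toy stand-in for the model (this two-state MDP is invented here, not taken from the course): a city is "unprotected" until it invests (cost 1) to become "protected", after which it collects a small well-being reward (0.5) every step. Value iteration shows that when future well-being is discounted less (gamma closer to 1), investing now becomes the optimal first action:

```python
# Value iteration on a tiny invest-or-save MDP (illustrative numbers).

def best_first_action(gamma, sweeps=500):
    V = {"unprotected": 0.0, "protected": 0.0}
    for _ in range(sweeps):
        q_invest = -1.0 + gamma * V["protected"]   # pay now, protected next step
        q_save = gamma * V["unprotected"]          # keep the budget, stay exposed
        V = {
            "unprotected": max(q_invest, q_save),
            "protected": 0.5 + gamma * V["protected"],  # steady well-being payoff
        }
    q_invest = -1.0 + gamma * V["protected"]
    q_save = gamma * V["unprotected"]
    return "invest" if q_invest > q_save else "save"

print(best_first_action(0.50))  # save: future benefits are discounted away
print(best_first_action(0.95))  # invest: future well-being weighs heavily
```

The ethical question the assignment raises is exactly which gamma is defensible: a discount factor is not a neutral modeling knob but a claim about how much future people's well-being counts.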

Needfinding

With this assignment, students will reflect on the group of users their project is intended to serve; their reasons for selecting these users; the notion of an “extreme user;” and the reason why their perspectives are valuable for the design process. It also asks them to reflect on what accommodations they make for their interviewees.

POV and Experience Prototypes

With this assignment, students are prompted to reflect on how proposed solutions to the problems they identify may exclude members of certain communities.

Residency Hours Scheduling

In this assignment, students explore constraint satisfaction problems (CSP) and use backtracking search to solve them. Many uses of constraint satisfaction in real-world scenarios involve assignment of resources to entities, like assigning packages to different trucks to optimize delivery. However, when the agents are people, the issue of fair division arises. In this question, students will consider the ethics of what constraints to remove in a CSP when the CSP is unsatisfiable.
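A minimal backtracking-search sketch, on a toy scheduling instance invented here (the variables, domains, and constraints are not from the assignment): each shift needs a resident, and no resident may take two consecutive shifts.

```python
# Generic backtracking search for a CSP with a fixed variable ordering.

def backtrack(assignment, variables, domains, consistent):
    if len(assignment) == len(variables):
        return dict(assignment)                   # all variables assigned
    var = variables[len(assignment)]              # next unassigned variable
    for value in domains[var]:
        assignment[var] = value
        if consistent(assignment):
            result = backtrack(assignment, variables, domains, consistent)
            if result is not None:
                return result
        del assignment[var]                       # undo and try next value
    return None                                   # unsatisfiable from here

shifts = ["mon", "tue", "wed"]
domains = {s: ["alice", "bob"] for s in shifts}

def no_consecutive(assign):
    order = [assign.get(s) for s in shifts]
    return all(a != b for a, b in zip(order, order[1:]) if a and b)

print(backtrack({}, shifts, domains, no_consecutive))
# {'mon': 'alice', 'tue': 'bob', 'wed': 'alice'}
```

When no assignment satisfies every constraint, the solver returns None, and the ethical question begins: which constraint (a rest requirement? a seniority preference?) gets relaxed, and who bears the cost of that choice.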

Sentiment Classification and Maximum Group Loss

Although each of the problems in the problem set builds on one another, the ethics assignment itself begins with Problem 4: Toxicity Classification and Maximum Group Loss. Toxicity classifiers are designed to assist in moderating online forums by predicting whether an online comment is toxic or not so that comments predicted to be toxic can be flagged for humans to review. Unfortunately, such models have been observed to be biased: non-toxic comments mentioning demographic identities often get misclassified as toxic (e.g., “I am a [demographic identity]”). These biases arise because toxic comments often mention and attack demographic identities, and as a result, models learn to _spuriously correlate_ toxicity with the mention of these identities. Therefore, some groups are more likely to have comments incorrectly flagged for review: their group-level loss is higher than other groups.
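The gap between average loss and maximum group loss is easy to compute. The per-example losses and group labels below are made up for illustration: a model can look fine on average while one group bears most of the error.

```python
# Average loss vs. maximum group loss on hypothetical per-comment losses.

def group_losses(losses, groups):
    per_group = {}
    for loss, g in zip(losses, groups):
        per_group.setdefault(g, []).append(loss)
    return {g: sum(v) / len(v) for g, v in per_group.items()}

losses = [0.1, 0.1, 0.1, 0.1, 0.9, 0.7]                    # invented losses
groups = ["none", "none", "none", "none", "identity", "identity"]

avg = sum(losses) / len(losses)
per_group = group_losses(losses, groups)
max_group = max(per_group.values())

print(round(avg, 3))        # average loss looks moderate
print(round(max_group, 3))  # the worst-off group's loss is much higher
```

Training to minimize the maximum group loss rather than the average is one response to this gap, at the cost of some average-case accuracy.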


Artificial Intelligence: Principles and Techniques

Artificial intelligence (AI) has had a huge impact in many areas, including medical diagnosis, speech recognition, robotics, web search, advertising, and scheduling. This course focuses on the foundational concepts that drive these applications. In short, AI is the mathematics of making good decisions given incomplete information (hence the need for probability) and limited computation (hence the need for algorithms). Specific topics include search, constraint satisfaction, game playing, Markov decision processes, graphical models, machine learning, and logic.

Introduction to Computer Organization & Systems

Introduction to the fundamental concepts of computer systems. Explores how computer systems execute programs and manipulate data, working from the C programming language down to the microprocessor.

Introduction to Human-Computer Interaction

Introduces fundamental methods and principles for designing, implementing, and evaluating user interfaces. Topics: user-centered design, rapid prototyping, experimentation, direct manipulation, cognitive principles, visual design, social software, software tools. Learn by doing: work with a team on a quarter-long design project, supported by lectures, readings, and studios.

Probability for Computer Scientists

Introduction to topics in probability including counting and combinatorics, random variables, conditional probability, independence, distributions, expectation, point estimation, and limit theorems. Applications of probability in computer science including machine learning and the use of probability in the analysis of algorithms.

Programming Methodology

Introduction to the engineering of computer applications emphasizing modern software engineering principles: object-oriented design, decomposition, encapsulation, abstraction, and testing. Emphasis is on good programming style and the built-in facilities of respective languages. No prior programming experience required.

Programming Abstractions

Abstraction and its relation to programming. Software engineering principles of data abstraction and modularity. Object-oriented programming, fundamental data structures (such as stacks, queues, sets) and data-directed design. Recursion and recursive data structures (linked lists, trees, graphs). Introduction to time and space complexity analysis. Uses the programming language C++ covering its basic facilities.

Reinforcement Learning

To realize the dreams and impact of AI requires autonomous systems that learn to make good decisions. Reinforcement learning is one powerful paradigm for doing so, and it is relevant to an enormous range of tasks, including robotics, game playing, consumer modeling and healthcare. This class will provide a solid introduction to the field of reinforcement learning and students will learn about the core challenges and approaches, including generalization and exploration. Through a combination of lectures, and written and coding assignments, students will become well versed in key ideas and techniques for RL. Assignments will include the basics of reinforcement learning as well as deep reinforcement learning — an extremely promising new area that combines deep learning techniques with reinforcement learning.

Operating Systems Principles

This class introduces the basic facilities provided by modern operating systems. The course divides into three major sections. The first part of the course discusses concurrency: how to manage multiple tasks that execute at the same time and share resources. Topics in this section include processes and threads, context switching, synchronization, scheduling, and deadlock. The second part of the course addresses the problem of memory management; it will cover topics such as linking, dynamic memory allocation, dynamic address translation, virtual memory, and demand paging. The third major part of the course concerns file systems, including topics such as storage devices, disk management and scheduling, directories, protection, and crash recovery. After these three major topics, the class will conclude with a few smaller topics such as virtual machines.

Design and Analysis of Algorithms

Worst and average case analysis. Recurrences and asymptotics. Efficient algorithms for sorting, searching, and selection. Data structures: binary search trees, heaps, hash tables. Algorithm design techniques: divide-and-conquer, dynamic programming, greedy algorithms, amortized analysis, randomization. Algorithms for fundamental graph problems: minimum-cost spanning tree, connected components, topological sort, and shortest paths. Possible additional topics: network flow, string searching.

Design for Behavior Change

Over the last decade, tech companies have invested in shaping user behavior, sometimes for altruistic reasons like helping people change bad habits into good ones, and sometimes for financial reasons such as increasing engagement. In this project-based hands-on course, students explore the design of systems, information, and interfaces for human use. We will model the flow of interactions, data, and context, and craft a design that is useful, appropriate, and robust. Students will design and prototype utility apps or games as a response to the challenges presented. We will also examine the ethical consequences of design decisions and explore current issues arising from unintended consequences. Prerequisite: CS147 or equivalent.

A new initiative seeks to integrate ethical thinking into computing


Technology is facing a bit of a reckoning.

Algorithms impact free speech, privacy, and autonomy. They, or the datasets on which they are trained, are often infused with bias or used to inappropriately manipulate people. And many technology companies are facing pushback against their immense power to impact the wellbeing of individuals and democratic institutions. Policymakers clearly need to address these problems. But universities also have an important role to play in preparing the next generation of computer scientists, says Mehran Sahami, professor and associate chair for education in the Computer Science department at Stanford University. “Computer scientists need to think about ethical issues from the outset rather than just building technology and letting problems surface downstream.”

To that end, the Stanford Computer Science department, the McCoy Family Center for Ethics in Society, and the Institute for Human-Centered Artificial Intelligence (HAI) are jointly launching an initiative to create ethics-based curriculum modules that will be embedded in the university’s core undergraduate computer science courses. Called Embedded EthiCS (the uppercase CS stands for computer science), the program is being developed in collaboration with a network of researchers who launched a similar program at Harvard University in 2017.

“Embedded EthiCS will allow us to revisit different ethical topics throughout the curriculum and have students get a better appreciation that these issues come up in a more constant and consistent manner, rather than just being addressed on the side or after the fact,” Sahami says.

Once the modules have been successfully implemented at Stanford, they will be disseminated online (under a Creative Commons license) and available for other universities to use or adapt as a part of their own core undergraduate computer science courses. “We hope, through this initiative, to make an engagement with ethical questions inescapable for people majoring in computer science everywhere,” says Rob Reich, professor of political science in the School of Humanities and Sciences, director of the McCoy Family Center for Ethics in Society, and associate director of Stanford HAI.

Expanding the Curriculum

Teaching ethics to Stanford undergraduate computer science students is not new. Individual courses have been around for more than 20 years, and a new interdisciplinary Ethics and Technology course was launched three years ago by Reich, Sahami, Jeremy Weinstein, professor of political science in the School of Humanities and Sciences, and other collaborators. But the Embedded EthiCS initiative will ensure that more students understand the importance of ethics in a technological context, Sahami says. And it signals to students that ethics is absolutely integral to their computer science education.

The initiative, which is funded by a member of the HAI advisory board, has already taken its first step: hiring Embedded EthiCS fellow Kathleen Creel. She will collaborate with computer science faculty to develop ethics modules that will be integrated into core undergraduate computer science courses during the next two years.

Creel, who says she feels as if she’s been training for this job her whole life, double majored in computer science and philosophy as an undergraduate before working in tech and then getting her PhD in the history and philosophy of science.

“Studying computer science changed the way I think about everything,” Creel says. She remembers being delighted by the way her mindset shifted as she learned how to formulate problems, define variables, and create optimization algorithms. She also realized (with help from her philosophy coursework) that each of those steps raised ethical questions. For example: For whom is this a problem? Who benefits from the solution to this problem? How does the formulation of this problem have ethical consequences? What am I trying to optimize?

“One of the hopes behind the Embedded EthiCS curriculum is that as you’re learning this whole computational mindset that will change your life and the way you think about everything, you’ll also practice, throughout the whole curriculum, building ethical thinking into that mindset.”

‘Spaces to Think’

The Embedded EthiCS modules created by Creel and her collaborators will be deployed in one class during the fall quarter of 2020, and two classes in each of the winter and spring quarters of 2021. Each module will include at least one lecture and one assignment that grapples with ethical issues relevant to the course. But Creel says she and her collaborators are also working on ways to more deeply embed the modules — so that they aren’t just stand-alone days.

Topics covered will vary depending on the course, but will include fairness and bias in machine learning algorithms, the manipulation of digital images, and other issues of interpersonal ethics in technology, such as how a self-driving car should behave in order to preserve human life or minimize suffering. Creel says modules will also address how technology should function in a democratic society, as well as “meta-ethical” issues such as how a person might balance duties as a software engineer for a particular company with duties as a moral agent more generally. “Students often want very much to do the right thing and want opportunities and spaces to think about how to do it,” Creel says.

The goal, says Anne Newman, research director at the McCoy Family Center for Ethics in Society, is “for students to gain the skills to be good reasoners about ethical dilemmas, and to understand what the competing values are — that there are value tensions and how to muddle through those.”

As Reich sees it, “We want the pipeline of first-rate computer scientists coming out of Stanford to have a full complement of ethical frameworks to accompany their technical prowess.” At the same time, he hopes that the many students at Stanford who take intro computer science courses but don’t major in the field will also benefit from understanding the ethical, social, and political implications of technology — whether as informed citizens, consumers, policy experts, researchers, or civil society leaders. “We won’t create overnight a new landscape for the governance or regulation of technology or professional ethics for computer scientists or technologists, but rather by educating the next generation,” he says.


How a new program at Stanford is embedding ethics into computer science

Shortly after Kathleen Creel started her position at Stanford as the inaugural Embedded EthiCS fellow some two years ago, a colleague sent her a 1989 newspaper clipping about the launch of Stanford’s first computer ethics course to show her how the university has long been committed to what Creel was tasked with: helping Stanford students understand the moral and ethical dimensions of technology.


Kathleen Creel is training the next generation of entrepreneurs and engineers to identify and work through various ethical and moral problems they will encounter in their careers. (Image credit: Courtesy Kathleen Creel)

While much has changed since the article was first published in the San Jose Mercury News, many of the issues that reporter Tom Philp discussed with renowned Stanford computer scientist Terry Winograd in the article remain relevant.

Describing some of the topics Stanford students were going to deliberate in Winograd’s course, during a period he described as “rapidly changing,” Philp wrote: “Should students freely share copyrighted software? Should they be concerned if their work has military applications? Should they submit a project on deadline if they are concerned that potential bugs could ruin peoples’ work?”

Three decades later, Winograd’s course on computer ethics has evolved, but now it is joined by a host of other efforts to expand ethics curricula at Stanford. Indeed, one of the main themes of the university’s Long Range Vision is embedding ethics across research and education. In 2020, the university launched the Ethics, Society, and Technology (EST) Hub, whose goal is to help ensure that technological advances born at Stanford address the full range of ethical and societal implications.

That same year, the EST Hub, in collaboration with the Stanford Institute for Human-Centered Artificial Intelligence (HAI), the McCoy Family Center for Ethics in Society, and the Computer Science Department, created the Embedded EthiCS program, which embeds ethics modules into core computer science courses. Creel is Embedded EthiCS’ first fellow.

Stanford University, situated in the heart of Silicon Valley and intertwined with the influence and impact inspired by technological innovations in the region and beyond, is a vital place for future engineers and technologists to think through their societal responsibilities, Creel said.

“I think teaching ethics specifically at Stanford is very important because many Stanford students go on to be very influential in the world of tech,” said Creel, whose own research explores the moral, political, and epistemic implications of how machine learning is used in the world.

“If we can make any difference in the culture of tech, Stanford is a good place to be doing it,” she said.

Establishing an ethical mindset

Creel is both a computer scientist and a philosopher. After double-majoring in both fields at Williams College in Massachusetts, she worked as a software engineer at MIT Lincoln Laboratory on a large-scale satellite project. There, she found herself asking profound philosophical questions about the dependence on technology in high-stakes situations, particularly when it comes to how AI-based systems have evolved to inform people’s decision-making. She wondered: how do people know they can trust these tools, and what information do they need in order to believe they can be a reliable addition to, or substitute for, human judgment?

Creel decided to confront these questions head-on in graduate school, and in 2020, she earned her PhD in the history and philosophy of science at the University of Pittsburgh.

During her time at Stanford, Creel has collaborated with faculty and lecturers across Stanford’s Computer Science department to identify various opportunities for students to think through the social consequences of technology – even if it’s just one or five minutes at a time.

Rather than have ethics be its own standalone seminar or dedicated class topic that is often presented at either the beginning or end of a course, the Embedded EthiCS program aims to intersperse ethics throughout the quarter by integrating it into core course assignments, class discussions, and lectures.

“The objective is to weave ethics into the curriculum organically so that it feels like a natural part of their practice,” said Creel, who has worked with professors on nine computer science courses, including CS106A: Programming Methodology; CS106B: Programming Abstractions; CS107: Computer Organization and Systems; CS109: Introduction to Probability for Computer Scientists; CS221: Artificial Intelligence: Principles and Techniques; CS161: Design and Analysis of Algorithms; and CS47B: Design for Behavior Change.

During her fellowship, Creel gave engaging lectures about specific ethical issues and worked with professors to develop new coursework that demonstrates how the choices students will make as engineers carry broader implications for society.

One of the instructors Creel worked with was Nick Troccoli, a lecturer in the Computer Science Department. Troccoli teaches CS107: Computer Organization & Systems, the third course in Stanford’s introductory programming sequence, which focuses mostly on how computer systems execute programs. Although some initially wondered how ethics would fit into such a technical curriculum, Creel and Troccoli, along with course assistant Brynne Hurst, found clear hooks for ethics discussions in assignments, lectures, and labs throughout the course.

For example, they refreshed a classic assignment about how to figure out a program’s behavior without seeing its code (“reverse engineering”). Students were asked to imagine they were security researchers hired by a bank to discover how a data breach had occurred, and how the hacked information could be combined with other publicly available information to discover bank customers’ secrets.

Creel talked about how anonymized datasets can be reverse engineered to reveal identifying information and why that is a problem. She introduced the students to different models of privacy, including differential privacy, a technique that can make privacy in a database more robust by minimizing identifiable information.
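The core idea of differential privacy can be sketched with the textbook Laplace mechanism (this is illustrative code, not the technique from the course): a counting query is answered with calibrated noise, so no individual's presence in the database changes the answer's distribution much.

```python
import math
import random

def laplace_mechanism(true_count, epsilon, rng):
    """Answer a counting query with Laplace noise of scale 1/epsilon
    (a count has sensitivity 1: one person changes it by at most 1)."""
    scale = 1.0 / epsilon
    u = rng.random() - 0.5                     # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

rng = random.Random(0)
print(laplace_mechanism(42, epsilon=0.5, rng=rng))   # noisy answer near 42
```

Smaller epsilon means stronger privacy and more noise; the parameter makes the privacy-accuracy trade-off explicit rather than hidden in ad hoc anonymization.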

Students were then tasked to provide recommendations to further anonymize or obfuscate data to avoid breaches.

“Katie helped students understand what potential scenarios may arise as a result of programming and how ethics can be a tool to allow you to better understand those kinds of issues,” Troccoli said.

Another instructor Creel worked with was Assistant Professor Aviad Rubinstein, who teaches CS161: Design and Analysis of Algorithms.

Creel and Rubinstein, joined by research assistant Ananya Karthik and course assistant Golrokh Emami, came up with an assignment in which students were asked to create an algorithm that would help a popular distributor decide the locations of its warehouses and determine which customers received one- versus two-day delivery.

Students worked through the many variables that determine warehouse location, such as cost, existing customer demand, and driver route efficiency. If the algorithm prioritized only these features, closer examination revealed that historically redlined Black American neighborhoods would be excluded from one-day delivery.

Students were then asked to develop another algorithm that would address the delivery issue while also optimizing even coverage and cost.
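The flavor of the optimization can be sketched with a greedy k-center heuristic (the coordinates and objective here are invented; the assignment's actual model may differ): pick k warehouse sites so that the farthest customer is as close as possible.

```python
# Greedy k-center on a 1-D toy instance: seed with one customer, then
# repeatedly add the customer farthest from all chosen centers.

def greedy_k_center(customers, k):
    centers = [customers[0]]
    while len(centers) < k:
        farthest = max(customers,
                       key=lambda c: min(abs(c - s) for s in centers))
        centers.append(farthest)
    return centers

customers = [0, 1, 2, 10, 11, 12]        # two clusters of neighborhoods
centers = greedy_k_center(customers, k=2)
coverage = max(min(abs(c - s) for s in centers) for c in customers)
print(centers, coverage)                  # one center per cluster
```

Because the objective bounds the worst-off customer's distance rather than the average, it already encodes a fairness judgment; optimizing average cost or demand-weighted cost instead is exactly the kind of choice that can quietly exclude some neighborhoods.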

The goal of the exercise was to show students that as engineers, they are also decision-makers whose choices carry real-world consequences that can affect equity and inclusion in communities across the country. Students were asked to also share what those concepts mean to them.

“The hope is to show them this is a problem they might genuinely face and that they might use algorithms to solve, and that ethics will guide them in making this choice,” Creel said. “Using the tools that we’ve taught them in the ethics curriculum, they will now be able to understand that choosing an algorithm is indeed a moral choice that they are making, not only a technical one.”

Developing moral courage

Some students have shared with Creel how they themselves have been subject to algorithmic biases.

For example, when the pandemic shuttered high schools across the country, some school districts turned to online proctoring services to help them deliver exams remotely. These services automate the supervision of students and their surroundings while they take a test.

However, these AI-driven services have come under criticism, particularly around issues concerning privacy and racial bias. For example, the scanning software sometimes fails to detect students with darker skin, Creel said.

Sometimes, there are just glitches in the computer system and the AI will flag a student even though no offense has taken place. But because of the proprietary nature of the technology, how the algorithm came to its decision is not always entirely apparent.

“Students really understand how if these services were more transparent, they could have pointed to something that could prove why an automated flag that may have gone up was wrong,” said Creel.

Overall, Creel said, students have been eager to develop the skillset to help them discuss and deliberate on the ethical dilemmas they could encounter in their professional careers.

“I think they are very aware that they, as young engineers, could be in a situation where someone above them asks them to do something that they don’t think is right,” she added. “They want tools to figure out what is right, and I think they also want help building the moral courage to figure out how to say no and to interact in an environment where they may not have a lot of power. For many of them, it feels very important and existential.”

Creel is now transitioning from her role at Stanford to Northeastern University where she will hold a joint appointment as an assistant professor of philosophy and computer science.

Building an Ethical Computational Mindset

Stanford launches an embedded EthiCS program to help students consistently think through the common issues that arise in computer science. 


Linda A. Cicero

Stanford's Embedded EthiCS program will ensure that more students understand the importance of ethics in a technological context and signal that ethics is integral to their work.

Technology is facing a bit of a reckoning. Algorithms impact free speech, privacy, and autonomy. They, or the datasets on which they are trained, are often infused with bias or used to inappropriately manipulate people. And many technology companies are facing pushback against their immense power to impact the wellbeing of individuals and democratic institutions. Policymakers clearly need to address these problems. But universities also have an important role to play in preparing the next generation of computer scientists, says Mehran Sahami, professor and associate chair for education in the Computer Science department at Stanford University. “Computer scientists need to think about ethical issues from the outset rather than just building technology and letting problems surface downstream.”

To that end, the Stanford Computer Science department, the McCoy Family Center for Ethics in Society, and the Institute for Human-Centered Artificial Intelligence (HAI) are jointly launching an initiative to create ethics-based curriculum modules that will be embedded in the university’s core undergraduate computer science courses. Called Embedded EthiCS (the uppercase CS stands for computer science), the program is being developed in collaboration with a network of researchers who launched a similar program at Harvard University in 2017.

“Embedded EthiCS will allow us to revisit different ethical topics throughout the curriculum and have students get a better appreciation that these issues come up in a more constant and consistent manner, rather than just being addressed on the side or after the fact,” Sahami says.  

Once the modules have been successfully implemented at Stanford, they will be disseminated online (under a Creative Commons license) and available for other universities to use or adapt as a part of their own core undergraduate computer science courses. “We hope, through this initiative, to make an engagement with ethical questions inescapable for people majoring in computer science everywhere,” says Rob Reich, professor of political science in the School of Humanities and Sciences, director of the McCoy Family Center for Ethics in Society, and associate director of Stanford HAI.

Expanding the Curriculum

Teaching ethics to Stanford undergraduate computer science students is not new. Individual courses have been around for more than 20 years, and a new interdisciplinary Ethics and Technology course was launched three years ago by Reich, Sahami, Jeremy Weinstein, professor of political science in the School of Humanities and Sciences, and other collaborators. But the Embedded EthiCS initiative will ensure that more students understand the importance of ethics in a technological context, Sahami says. And it signals to students that ethics is absolutely integral to their computer science education.

The initiative, which is funded by a member of the HAI advisory board, has already taken its first step: hiring Embedded EthiCS fellow Kathleen Creel. She will collaborate with computer science faculty to develop ethics modules that will be integrated into core undergraduate computer science courses during the next two years.

Creel, who says she feels as if she’s been training for this job her whole life, double majored in computer science and philosophy as an undergraduate before working in tech and then getting her PhD in the history and philosophy of science. 

“Studying computer science changed the way I think about everything,” Creel says. She remembers being delighted by the way her mindset shifted as she learned how to formulate problems, define variables, and create optimization algorithms. She also realized (with help from her philosophy coursework) that each of those steps raised ethical questions. For example: For whom is this a problem? Who benefits from the solution to this problem? How does the formulation of this problem have ethical consequences? What am I trying to optimize?

“One of the hopes behind the Embedded EthiCS curriculum is that as you’re learning this whole computational mindset that will change your life and the way you think about everything, you’ll also practice, throughout the whole curriculum, building ethical thinking into that mindset.”

‘Spaces to Think’

The Embedded EthiCS modules created by Creel and her collaborators will be deployed in one class during the fall quarter of 2020 and two classes in each of the winter and spring quarters of 2021. Each module will include at least one lecture and one assignment that grapples with ethical issues relevant to the course. But Creel says she and her collaborators are also working on ways to more deeply embed the modules – so that they aren’t just stand-alone days.

Topics covered will vary depending on the course, but will include fairness and bias in machine learning algorithms, the manipulation of digital images, and other issues of interpersonal ethics in technology, such as how a self-driving car should behave in order to preserve human life or minimize suffering. Creel says modules will also address how technology should function in a democratic society, as well as “meta-ethical” issues such as how a person might balance duties as a software engineer for a particular company with duties as a moral agent more generally. “Students often want very much to do the right thing and want opportunities and spaces to think about how to do it,” Creel says. 

The goal, says Anne Newman, research director at the McCoy Family Center for Ethics in Society, is “for students to gain the skills to be good reasoners about ethical dilemmas, and to understand what the competing values are – that there are value tensions and how to muddle through those.”

As Reich sees it, “We want the pipeline of first-rate computer scientists coming out of Stanford to have a full complement of ethical frameworks to accompany their technical prowess.” At the same time, he hopes that the many students at Stanford who take intro computer science courses but don’t major in the field will also benefit from understanding the ethical, social, and political implications of technology – whether as informed citizens, consumers, policy experts, researchers, or civil society leaders. “We won’t create overnight a new landscape for the governance or regulation of technology or professional ethics for computer scientists or technologists, but rather by educating the next generation,” he says. 




Embedding Ethics in Computer Science Curriculum

Photo illustration by Judy Blomquist/Harvard Staff

Paul Karoff

SEAS Communications

Harvard initiative seen as a national model

Barbara Grosz has a fantasy that every time a computer scientist logs on to write an algorithm or build a system, a message will flash across the screen that asks, “Have you thought about the ethical implications of what you’re doing?”

Until that day arrives, Grosz, the Higgins Professor of Natural Sciences at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), is working to instill in the next generation of computer scientists a mindset that considers the societal impact of their work, and the ethical reasoning and communications skills to do so.

“Ethics permeates the design of almost every computer system or algorithm that’s going out in the world,” Grosz said. “We want to educate our students to think not only about what systems they could build, but whether they should build those systems and how they should design those systems.”

At a time when computer science departments around the country are grappling with how to turn out graduates who understand ethics as well as algorithms, Harvard is taking a novel approach.

In 2015, Grosz designed a new course called “Intelligent Systems: Design and Ethical Challenges.” An expert in artificial intelligence and a pioneer in natural language processing, Grosz turned to colleagues from Harvard’s philosophy department to co-teach the course. They interspersed into the course’s technical content a series of real-life ethical conundrums and the relevant philosophical theories necessary to evaluate them. This forced students to confront questions that, unlike most computer science problems, have no obvious correct answer.

Students responded. The course quickly attracted a following and by the second year 140 people were competing for 30 spots. There was a demand for more such courses, not only on the part of students, but by Grosz’s computer science faculty colleagues as well.

“The faculty thought this was interesting and important, but they didn’t have expertise in ethics to teach it themselves,” she said.

Barbara Grosz (from left), Jeffrey Behrends, and Alison Simmons hope Harvard’s approach to turning out graduates who understand ethics as well as algorithms becomes a national model.

Rose Lincoln/Harvard Staff Photographer

In response, Grosz and collaborator Alison Simmons, the Samuel H. Wolcott Professor of Philosophy, developed a model that draws on the expertise of the philosophy department and integrates it into a growing list of more than a dozen computer science courses, from introductory programming to graduate-level theory.

Under the initiative, dubbed Embedded EthiCS, philosophy graduate students are paired with computer science faculty members. Together, they review the course material and decide on an ethically rich topic that will naturally arise from the content. A graduate student identifies readings and develops a case study, activities, and assignments that will reinforce the material. The computer science and philosophy instructors teach side by side when the Embedded EthiCS material is brought to the classroom.

Grosz and her philosophy colleagues are at the center of a movement that they hope will spread to computer science programs around the country. Harvard’s “distributed pedagogy” approach is different from many university programs that treat ethics by adding a stand-alone course that is, more often than not, just an elective for computer science majors.

“Standalone courses can be great, but they can send the message that ethics is something that you think about after you’ve done your ‘real’ computer science work,” Simmons said. “We want to send the message that ethical reasoning is part of what you do as a computer scientist.”

Embedding ethics across the curriculum helps computer science students see how ethical issues can arise from many contexts, issues ranging from the way social networks facilitate the spread of false information to censorship to machine-learning techniques that empower statistical inferences in employment and in the criminal justice system.

Courses in artificial intelligence and machine learning are obvious areas for ethical discussions, but Embedded EthiCS also has built modules for less-obvious pairings, such as applied algebra.

“We really want to get students habituated to thinking: How might an ethical issue arise in this context or that context?” Simmons said.


Curriculum at a glance

A sampling of classes from the Embedded EthiCS pilot program and the issues they address

  • Great Ideas in Computer Science: The ethics of electronic privacy
  • Introduction to Computer Science II: Morally responsible software engineering
  • Networks: Facebook, fake news, and ethics of censorship
  • Programming Languages: Verifiably ethical software systems
  • Design of Useful and Usable Interactive Systems: Inclusive design and equality of opportunity
  • Introduction to AI: Machines and moral decision making
  • Autonomous Robot Systems: Robots and work

David Parkes, George F. Colony Professor of Computer Science, teaches a wide-ranging undergraduate class on topics in algorithmic economics. “Without this initiative, I would have struggled to craft the right ethical questions related to rules for matching markets, or choosing objectives for recommender systems,” he said. “It has been an eye-opening experience to get students to think carefully about ethical issues.”

Grosz acknowledged that it can be a challenge for computer science faculty and their students to wrap their heads around often opaque ethical quandaries.

“Computer scientists are used to there being ways to prove problem set answers correct or algorithms efficient,” she said. “To wind up in a situation where different values lead to there being trade-offs and ways to support different ‘right conclusions’ is a challenging mind shift. But getting these normative issues into the computer system designer’s mind is crucial for society right now.”

Jeffrey Behrends, currently a fellow-in-residence at Harvard’s Edmond J. Safra Center for Ethics, has co-taught the design and ethics course with Grosz. Behrends said the experience revealed greater harmony between the two fields than one might expect.

“Once students who are unfamiliar with philosophy are introduced to it, they realize that it’s not some arcane enterprise that’s wholly independent from other ways of thinking about the world,” he said. “A lot of students who are attracted to computer science are also attracted to some of the methodologies of philosophy, because we emphasize rigorous thinking. We emphasize a methodology for solving problems that doesn’t look too dissimilar from some of the methodologies in solving problems in computer science.”

The Embedded EthiCS model has attracted interest from universities — and companies — around the country. Recently, experts from more than 20 institutions gathered at Harvard for a workshop on the challenges and best practices for integrating ethics into computer science curricula. Mary Gray, a senior researcher at Microsoft Research (and a fellow at Harvard’s Berkman Klein Center for Internet and Society), who helped convene the gathering, said that in addition to impeccable technical chops, employers increasingly are looking for people who understand the need to create technology that is accessible and socially responsible.

“Our challenge in industry is to help researchers and practitioners not see ethics as a box that has to be checked at the end, but rather to think about these things from the very beginning of a project,” Gray said.

Those concerns recently inspired the Association for Computing Machinery (ACM), the world’s largest scientific and educational computing society, to update its code of ethics for the first time since 1992.

In hope of spreading the Embedded EthiCS concept widely across the computer science landscape, Grosz and colleagues have authored a paper to be published in the journal Communications of the ACM and launched a website to serve as an open-source repository of their most successful course modules.

They envision a culture shift that leads to a new generation of ethically minded computer science practitioners.

“In our dream world, success will lead to better-informed policymakers and new corporate models of organization that build ethics into all stages of design and corporate leadership,” Behrends says.


The experiment has also led to interesting conversations beyond the realm of computer science.

“We’ve been doing this in the context of technology, but embedding ethics in this way is important for every scientific discipline that is putting things out in the world,” Grosz said. “To do that, we will need to grow a generation of philosophers who will think about ways in which they can take philosophical ethics and normative thinking, and bring it to all of science and technology.”

Carefully designed course modules

At the heart of the Embedded EthiCS program are carefully designed, course-specific modules, collaboratively developed by faculty along with computer science and philosophy graduate student teaching fellows.

A module that Kate Vredenburgh, a philosophy Ph.D. student, created for a course taught by Professor Finale Doshi-Velez asks students to grapple with questions of how machine-learning models can be discriminatory, and how that discrimination can be reduced. An introductory lecture sets out a philosophical framework of what discrimination is, including the concepts of disparate treatment and impact. Students learn how eliminating discrimination in machine learning requires more than simply reducing bias in the technical sense. Even setting a socially good task may not be enough to reduce discrimination, since machine learning relies on predictively useful correlations and those correlations sometimes result in increased inequality between groups.

The module illuminates the ramifications and potential limitations of using a disparate impact definition to identify discrimination. It also introduces technical computer science work on discrimination — statistical fairness criteria. An in-class exercise focuses on a case in which an algorithm that predicts the success of job applicants to sales positions at a major retailer results in fewer African-Americans being recommended for positions than white applicants.
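The statistical side of a disparate impact analysis like the one in this case study can be made concrete with a short calculation. The sketch below is illustrative only: the applicant data are invented, and it applies the "four-fifths rule" from U.S. employment guidelines, under which a selection rate for one group below 80% of the highest group's rate is commonly treated as evidence of disparate impact. It is not drawn from the actual course module.

```python
# Hypothetical sketch: checking a screening model's recommendations for
# disparate impact via the "four-fifths rule". All data below is invented.

def selection_rate(decisions):
    """Fraction of applicants recommended (1 = recommended, 0 = not)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group selection rate to the higher one."""
    rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Invented model outputs for two applicant groups.
white_applicants = [1, 1, 0, 1, 1, 0, 1, 1, 0, 1]  # 70% recommended
black_applicants = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]  # 30% recommended

ratio = disparate_impact_ratio(white_applicants, black_applicants)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.30 / 0.70 = 0.43
if ratio < 0.8:
    print("Below the four-fifths threshold: possible disparate impact.")
```

Note that a ratio like this is only one of several statistical fairness criteria; as the module emphasizes, satisfying one criterion does not by itself show that a system is free of discrimination.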

An out-of-class assignment asks students to draw on this grounding to address a concrete ethical problem faced by working computer scientists (that is, software engineers working for the Department of Labor). The assignment gives students an opportunity to apply the material to a real-world problem of the sort they might face in their careers, and asks them to articulate and defend their approach to solving the problem.



2018 ACM Code

ACM Code of Ethics and Professional Conduct

This Code and its guidelines were adopted by the ACM Council on June 22, 2018. The Preamble was amended on October 22, 2021 to reflect changes in ACM award policies.

Computing professionals’ actions change the world. To act responsibly, they should reflect upon the wider impacts of their work, consistently supporting the public good. The ACM Code of Ethics and Professional Conduct (“the Code”) expresses the conscience of the profession.

The Code is designed to inspire and guide the ethical conduct of all computing professionals, including current and aspiring practitioners, instructors, students, influencers, and anyone who uses computing technology in an impactful way. Additionally, the Code serves as a basis for remediation when violations occur. The Code includes principles formulated as statements of responsibility, based on the understanding that the public good is always the primary consideration. Each principle is supplemented by guidelines, which provide explanations to assist computing professionals in understanding and applying the principle.

Section 1 outlines fundamental ethical principles that form the basis for the remainder of the Code. Section 2 addresses additional, more specific considerations of professional responsibility. Section 3 guides individuals who have a leadership role, whether in the workplace or in a volunteer professional capacity. Commitment to ethical conduct is required of every ACM member, ACM SIG member, ACM award recipient, and ACM SIG award recipient. Principles involving compliance with the Code are given in Section 4.

The Code as a whole is concerned with how fundamental ethical principles apply to a computing professional’s conduct. The Code is not an algorithm for solving ethical problems; rather it serves as a basis for ethical decision-making. When thinking through a particular issue, a computing professional may find that multiple principles should be taken into account, and that different principles will have different relevance to the issue. Questions related to these kinds of issues can best be answered by thoughtful consideration of the fundamental ethical principles, understanding that the public good is the paramount consideration. The entire computing profession benefits when the ethical decision-making process is accountable to and transparent to all stakeholders. Open discussions about ethical issues promote this accountability and transparency.

1. GENERAL ETHICAL PRINCIPLES.

A computing professional should…

1.1 Contribute to society and to human well-being, acknowledging that all people are stakeholders in computing.

This principle, which concerns the quality of life of all people, affirms an obligation of computing professionals, both individually and collectively, to use their skills for the benefit of society, its members, and the environment surrounding them. This obligation includes promoting fundamental human rights and protecting each individual’s right to autonomy. An essential aim of computing professionals is to minimize negative consequences of computing, including threats to health, safety, personal security, and privacy. When the interests of multiple groups conflict, the needs of those less advantaged should be given increased attention and priority.

Computing professionals should consider whether the results of their efforts will respect diversity, will be used in socially responsible ways, will meet social needs, and will be broadly accessible. They are encouraged to actively contribute to society by engaging in pro bono or volunteer work that benefits the public good.

In addition to a safe social environment, human well-being requires a safe natural environment. Therefore, computing professionals should promote environmental sustainability both locally and globally.

1.2 Avoid harm.

In this document, “harm” means negative consequences, especially when those consequences are significant and unjust. Examples of harm include unjustified physical or mental injury, unjustified destruction or disclosure of information, and unjustified damage to property, reputation, and the environment. This list is not exhaustive.

Well-intended actions, including those that accomplish assigned duties, may lead to harm. When that harm is unintended, those responsible are obliged to undo or mitigate the harm as much as possible. Avoiding harm begins with careful consideration of potential impacts on all those affected by decisions. When harm is an intentional part of the system, those responsible are obligated to ensure that the harm is ethically justified. In either case, ensure that all harm is minimized.

To minimize the possibility of indirectly or unintentionally harming others, computing professionals should follow generally accepted best practices unless there is a compelling ethical reason to do otherwise. Additionally, the consequences of data aggregation and emergent properties of systems should be carefully analyzed. Those involved with pervasive or infrastructure systems should also consider Principle 3.7.

A computing professional has an additional obligation to report any signs of system risks that might result in harm. If leaders do not act to curtail or mitigate such risks, it may be necessary to “blow the whistle” to reduce potential harm. However, capricious or misguided reporting of risks can itself be harmful. Before reporting risks, a computing professional should carefully assess relevant aspects of the situation.

1.3 Be honest and trustworthy.

Honesty is an essential component of trustworthiness. A computing professional should be transparent and provide full disclosure of all pertinent system capabilities, limitations, and potential problems to the appropriate parties. Making deliberately false or misleading claims, fabricating or falsifying data, offering or accepting bribes, and other dishonest conduct are violations of the Code.

Computing professionals should be honest about their qualifications, and about any limitations in their competence to complete a task. Computing professionals should be forthright about any circumstances that might lead to either real or perceived conflicts of interest or otherwise tend to undermine the independence of their judgment. Furthermore, commitments should be honored.

Computing professionals should not misrepresent an organization’s policies or procedures, and should not speak on behalf of an organization unless authorized to do so.

1.4 Be fair and take action not to discriminate.

The values of equality, tolerance, respect for others, and justice govern this principle. Fairness requires that even careful decision processes provide some avenue for redress of grievances.

Computing professionals should foster fair participation of all people, including those of underrepresented groups. Prejudicial discrimination on the basis of age, color, disability, ethnicity, family status, gender identity, labor union membership, military status, nationality, race, religion or belief, sex, sexual orientation, or any other inappropriate factor is an explicit violation of the Code. Harassment, including sexual harassment, bullying, and other abuses of power and authority, is a form of discrimination that, amongst other harms, limits fair access to the virtual and physical spaces where such harassment takes place.

The use of information and technology may cause new, or enhance existing, inequities. Technologies and practices should be as inclusive and accessible as possible and computing professionals should take action to avoid creating systems or technologies that disenfranchise or oppress people. Failure to design for inclusiveness and accessibility may constitute unfair discrimination.

1.5 Respect the work required to produce new ideas, inventions, creative works, and computing artifacts.

Developing new ideas, inventions, creative works, and computing artifacts creates value for society, and those who expend this effort should expect to gain value from their work. Computing professionals should therefore credit the creators of ideas, inventions, work, and artifacts, and respect copyrights, patents, trade secrets, license agreements, and other methods of protecting authors’ works.

Both custom and the law recognize that some exceptions to a creator’s control of a work are necessary for the public good. Computing professionals should not unduly oppose reasonable uses of their intellectual works. Efforts to help others by contributing time and energy to projects that help society illustrate a positive aspect of this principle. Such efforts include free and open source software and work put into the public domain. Computing professionals should not claim private ownership of work that they or others have shared as public resources.

1.6 Respect privacy.

The responsibility of respecting privacy applies to computing professionals in a particularly profound way. Technology enables the collection, monitoring, and exchange of personal information quickly, inexpensively, and often without the knowledge of the people affected. Therefore, a computing professional should become conversant in the various definitions and forms of privacy and should understand the rights and responsibilities associated with the collection and use of personal information.

Computing professionals should only use personal information for legitimate ends and without violating the rights of individuals and groups. This requires taking precautions to prevent re-identification of anonymized data or unauthorized data collection, ensuring the accuracy of data, understanding the provenance of the data, and protecting it from unauthorized access and accidental disclosure. Computing professionals should establish transparent policies and procedures that allow individuals to understand what data is being collected and how it is being used, to give informed consent for automatic data collection, and to review, obtain, correct inaccuracies in, and delete their personal data.

Only the minimum amount of personal information necessary should be collected in a system. The retention and disposal periods for that information should be clearly defined, enforced, and communicated to data subjects. Personal information gathered for a specific purpose should not be used for other purposes without the person’s consent. Merged data collections can compromise privacy features present in the original collections. Therefore, computing professionals should take special care for privacy when merging data collections.

1.7 Honor confidentiality.

Computing professionals are often entrusted with confidential information such as trade secrets, client data, nonpublic business strategies, financial information, research data, pre-publication scholarly articles, and patent applications. Computing professionals should protect confidentiality except in cases where it is evidence of the violation of law, of organizational regulations, or of the Code. In these cases, the nature or contents of that information should not be disclosed except to appropriate authorities. A computing professional should consider thoughtfully whether such disclosures are consistent with the Code.

2. PROFESSIONAL RESPONSIBILITIES.

2.1 Strive to achieve high quality in both the processes and products of professional work.

Computing professionals should insist on and support high quality work from themselves and from colleagues. The dignity of employers, employees, colleagues, clients, users, and anyone else affected either directly or indirectly by the work should be respected throughout the process. Computing professionals should respect the right of those involved to transparent communication about the project. Professionals should be cognizant of any serious negative consequences affecting any stakeholder that may result from poor quality work and should resist inducements to neglect this responsibility.

2.2 Maintain high standards of professional competence, conduct, and ethical practice.

High quality computing depends on individuals and teams who take personal and group responsibility for acquiring and maintaining professional competence. Professional competence starts with technical knowledge and with awareness of the social context in which their work may be deployed. Professional competence also requires skill in communication, in reflective analysis, and in recognizing and navigating ethical challenges. Upgrading skills should be an ongoing process and might include independent study, attending conferences or seminars, and other informal or formal education. Professional organizations and employers should encourage and facilitate these activities.

2.3 Know and respect existing rules pertaining to professional work.

“Rules” here include local, regional, national, and international laws and regulations, as well as any policies and procedures of the organizations to which the professional belongs. Computing professionals must abide by these rules unless there is a compelling ethical justification to do otherwise. Rules that are judged unethical should be challenged. A rule may be unethical when it has an inadequate moral basis or causes recognizable harm. A computing professional should consider challenging the rule through existing channels before violating the rule. A computing professional who decides to violate a rule because it is unethical, or for any other reason, must consider potential consequences and accept responsibility for that action.

2.4 Accept and provide appropriate professional review.

High quality professional work in computing depends on professional review at all stages. Whenever appropriate, computing professionals should seek and utilize peer and stakeholder review. Computing professionals should also provide constructive, critical reviews of others’ work.

2.5 Give comprehensive and thorough evaluations of computer systems and their impacts, including analysis of possible risks.

Computing professionals are in a position of trust, and therefore have a special responsibility to provide objective, credible evaluations and testimony to employers, employees, clients, users, and the public. Computing professionals should strive to be perceptive, thorough, and objective when evaluating, recommending, and presenting system descriptions and alternatives. Extraordinary care should be taken to identify and mitigate potential risks in machine learning systems. A system for which future risks cannot be reliably predicted requires frequent reassessment of risk as the system evolves in use, or it should not be deployed. Any issues that might result in major risk must be reported to appropriate parties.

2.6 Perform work only in areas of competence.

A computing professional is responsible for evaluating potential work assignments. This includes evaluating the work’s feasibility and advisability, and making a judgment about whether the work assignment is within the professional’s areas of competence. If at any time before or during the work assignment the professional identifies a lack of a necessary expertise, they must disclose this to the employer or client. The client or employer may decide to pursue the assignment with the professional after additional time to acquire the necessary competencies, to pursue the assignment with someone else who has the required expertise, or to forgo the assignment. A computing professional’s ethical judgment should be the final guide in deciding whether to work on the assignment.

2.7 Foster public awareness and understanding of computing, related technologies, and their consequences.

As appropriate to the context and one’s abilities, computing professionals should share technical knowledge with the public, foster awareness of computing, and encourage understanding of computing. These communications with the public should be clear, respectful, and welcoming. Important issues include the impacts of computer systems, their limitations, their vulnerabilities, and the opportunities that they present. Additionally, a computing professional should respectfully address inaccurate or misleading information related to computing.

2.8 Access computing and communication resources only when authorized or when compelled by the public good.

Individuals and organizations have the right to restrict access to their systems and data so long as the restrictions are consistent with other principles in the Code. Consequently, computing professionals should not access another’s computer system, software, or data without a reasonable belief that such an action would be authorized or a compelling belief that it is consistent with the public good. A system being publicly accessible is not sufficient grounds on its own to imply authorization. Under exceptional circumstances a computing professional may use unauthorized access to disrupt or inhibit the functioning of malicious systems; extraordinary precautions must be taken in these instances to avoid harm to others.

2.9 Design and implement systems that are robustly and usably secure.

Breaches of computer security cause harm. Robust security should be a primary consideration when designing and implementing systems. Computing professionals should perform due diligence to ensure the system functions as intended, and take appropriate action to secure resources against accidental and intentional misuse, modification, and denial of service. As threats can arise and change after a system is deployed, computing professionals should integrate mitigation techniques and policies, such as monitoring, patching, and vulnerability reporting. Computing professionals should also take steps to ensure parties affected by data breaches are notified in a timely and clear manner, providing appropriate guidance and remediation.
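One standardized vehicle for the vulnerability reporting this guideline calls for is the security.txt file defined by RFC 9116, served at `/.well-known/security.txt`. A minimal example follows; the contact address and policy URL are placeholders:

```text
Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59.000Z
Policy: https://example.com/vulnerability-disclosure
```

The required Contact and Expires fields tell security researchers where to report a flaw and how long the published information remains valid, lowering the barrier to responsible disclosure.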

To ensure the system achieves its intended purpose, security features should be designed to be as intuitive and easy to use as possible. Computing professionals should discourage security precautions that are too confusing, are situationally inappropriate, or otherwise inhibit legitimate use.

In cases where misuse or harm are predictable or unavoidable, the best option may be to not implement the system.

3. PROFESSIONAL LEADERSHIP PRINCIPLES.

Leadership may either be a formal designation or arise informally from influence over others. In this section, “leader” means any member of an organization or group who has influence, educational responsibilities, or managerial responsibilities. While these principles apply to all computing professionals, leaders bear a heightened responsibility to uphold and promote them, both within and through their organizations.

A computing professional, especially one acting as a leader, should…

3.1 Ensure that the public good is the central concern during all professional computing work.

People—including users, customers, colleagues, and others affected directly or indirectly— should always be the central concern in computing. The public good should always be an explicit consideration when evaluating tasks associated with research, requirements analysis, design, implementation, testing, validation, deployment, maintenance, retirement, and disposal. Computing professionals should keep this focus no matter which methodologies or techniques they use in their practice.

3.2 Articulate, encourage acceptance of, and evaluate fulfillment of social responsibilities by members of the organization or group.

Technical organizations and groups affect broader society, and their leaders should accept the associated responsibilities. Organizations—through procedures and attitudes oriented toward quality, transparency, and the welfare of society—reduce harm to the public and raise awareness of the influence of technology in our lives. Therefore, leaders should encourage full participation of computing professionals in meeting relevant social responsibilities and discourage tendencies to do otherwise.

3.3 Manage personnel and resources to enhance the quality of working life.

Leaders should ensure that they enhance, not degrade, the quality of working life. Leaders should consider the personal and professional development, accessibility requirements, physical safety, psychological well-being, and human dignity of all workers. Appropriate human-computer ergonomic standards should be used in the workplace.

3.4 Articulate, apply, and support policies and processes that reflect the principles of the Code.

Leaders should pursue clearly defined organizational policies that are consistent with the Code and effectively communicate them to relevant stakeholders. In addition, leaders should encourage and reward compliance with those policies, and take appropriate action when policies are violated. Designing or implementing processes that deliberately or negligently violate, or tend to enable the violation of, the Code’s principles is ethically unacceptable.

3.5 Create opportunities for members of the organization or group to grow as professionals.

Educational opportunities are essential for all organization and group members. Leaders should ensure that opportunities are available to computing professionals to help them improve their knowledge and skills in professionalism, in the practice of ethics, and in their technical specialties. These opportunities should include experiences that familiarize computing professionals with the consequences and limitations of particular types of systems. Computing professionals should be fully aware of the dangers of oversimplified approaches, the improbability of anticipating every possible operating condition, the inevitability of software errors, the interactions of systems and their contexts, and other issues related to the complexity of their profession—and thus be confident in taking on responsibilities for the work that they do.

3.6 Use care when modifying or retiring systems.

Interface changes, the removal of features, and even software updates have an impact on the productivity of users and the quality of their work. Leaders should take care when changing or discontinuing support for system features on which people still depend. Leaders should thoroughly investigate viable alternatives to removing support for a legacy system. If these alternatives are unacceptably risky or impractical, the developer should assist stakeholders’ graceful migration from the system to an alternative. Users should be notified of the risks of continued use of the unsupported system long before support ends. Computing professionals should assist system users in monitoring the operational viability of their computing systems, and help them understand that timely replacement of inappropriate or outdated features or entire systems may be needed.

3.7 Recognize and take special care of systems that become integrated into the infrastructure of society.

Even the simplest computer systems have the potential to impact all aspects of society when integrated with everyday activities such as commerce, travel, government, healthcare, and education. When organizations and groups develop systems that become an important part of the infrastructure of society, their leaders have an added responsibility to be good stewards of these systems. Part of that stewardship requires establishing policies for fair system access, including for those who may have been excluded. That stewardship also requires that computing professionals monitor the level of integration of their systems into the infrastructure of society. As the level of adoption changes, the ethical responsibilities of the organization or group are likely to change as well. Continual monitoring of how society is using a system will allow the organization or group to remain consistent with their ethical obligations outlined in the Code. When appropriate standards of care do not exist, computing professionals have a duty to ensure they are developed.

4. COMPLIANCE WITH THE CODE.

4.1 Uphold, promote, and respect the principles of the Code.

The future of computing depends on both technical and ethical excellence. Computing professionals should adhere to the principles of the Code and contribute to improving them. Computing professionals who recognize breaches of the Code should take actions to resolve the ethical issues they recognize, including, when reasonable, expressing their concern to the person or persons thought to be violating the Code.

4.2 Treat violations of the Code as inconsistent with membership in the ACM.

Each ACM member should encourage and support adherence by all computing professionals regardless of ACM membership. ACM members who recognize a breach of the Code should consider reporting the violation to the ACM, which may result in remedial action as specified in the ACM’s Code of Ethics and Professional Conduct Enforcement Policy.

The Code and guidelines were developed by the ACM Code 2018 Task Force: Executive Committee Don Gotterbarn (Chair), Bo Brinkman, Catherine Flick, Michael S Kirkpatrick, Keith Miller, Kate Varansky, and Marty J Wolf. Members: Eve Anderson, Ron Anderson, Amy Bruckman, Karla Carter, Michael Davis, Penny Duquenoy, Jeremy Epstein, Kai Kimppa, Lorraine Kisselburgh, Shrawan Kumar, Andrew McGettrick, Natasa Milic-Frayling, Denise Oram, Simon Rogerson, David Shama, Janice Sipior, Eugene Spafford, and Les Waguespack. The Task Force was organized by the ACM Committee on Professional Ethics. Significant contributions to the Code were also made by the broader international ACM membership. This Code and its guidelines were adopted by the ACM Council on June 22nd, 2018.

This Code may be published without permission as long as it is not changed in any way and it carries the copyright notice. Copyright (c) 2018 by the Association for Computing Machinery.

Be sure to check out our guide on Using the Code for decision making for practicing engineers.

The official repository of the ACM Code of Ethics and Professional Conduct is  https://www.acm.org/about-acm/acm-code-of-ethics-and-professional-conduct . This Code constitutes Bylaw 15 of the Bylaws of the Association for Computing Machinery.


Unethical and Illegal Practices in Coding: From Prevention to Action

Jun 04, 2019


Cases of abusive programming practices and unethical and/or illegal coding have been reported all over the media, which isn’t great publicity for developers and engineers. The question of who bears the responsibility remains on everyone’s lips. But while employers are legally more accountable for bad decisions involving the company, developers won’t always find themselves exempt from criminal prosecution either. A senior executive at Volkswagen was sentenced to seven years in prison after the emissions scandal, but a VW engineer also got 40 months of jail time for his role in the case. So if a request from an employer doesn’t feel right, you need to think twice before embarking on a risky journey: It might end up costing you more than your conscience. But how can these practices be prevented further upstream, and what can you do to protect yourself from unethical and/or illegal requests at work?

A boom in cases

The Volkswagen scandal, which erupted in September 2015, made a lot of noise: Authorities found out that the firm’s engineers had programmed their cars to meet US environmental standards during laboratory testing, while on the road they were actually emitting up to 40 times the legal limit of nitrogen oxides. The illegal software had been implemented in about 11 million cars worldwide. It ended with a US federal judge ordering Volkswagen to “pay a $2.8 billion criminal fine for rigging diesel-powered vehicles to cheat on government emissions tests,” reported The Wall Street Journal in 2017.

The famous transport company Uber has also been under the spotlight during the past few years because of several algorithms its software developers created. Investigations revealed that these algorithms contributed to suspicious practices, such as exploiting underpaid drivers and ignoring regulations in cities around the world. Meanwhile, Supinfo, the private international university that specializes in computer science, reported how, following the publication of Bill Sourour’s account of the code he’s “still ashamed of”, which shook the whole IT industry, developers from all over the world started sharing their own experiences of being asked to participate in unethical and/or illegal programming during their careers.

In 2016, Sourour, the forerunner of this wave of denunciation, posted on his blog about his worst experience of an unethical request. In the post, he told how, at 21, he had landed a full-time coding job with an interactive marketing firm in Toronto, Canada, a country with strict regulations on the advertising of prescription drugs. Many of the firm’s clients were large pharmaceutical companies who, as a way of circumventing the law, would create websites presenting general information about the symptoms their drugs were meant to address. “Then, if a visitor could prove they had a prescription, they were given access to a patient portal with more specific info about the drug,” wrote Sourour.

For one website, targeted at young women, he was asked to code up a quiz whose questions would always lead to the client’s drug as the final recommendation. This practice was not illegal, but one day he was alerted to a news report that a young girl who had taken the drug had killed herself. That was when he discovered that some of the main side effects of the drug were severe depression and suicidal thoughts. And he has never been able to forgive himself for writing the code. “As developers, we are often one of the last lines of defense against potentially dangerous and unethical practices,” he wrote. It was time to take action.

Unethical vs. illegal: Where to draw the line?

If illegal requests usually seem clearer and easier to turn down, ethical matters can be fuzzier and more challenging to manage. But what does the term ethics stand for officially and what does it cover? Florence Mulliez, a partner at the French law firm FIDAL who specializes in digital and compliance technology, defines ethics as “the collective conscience in front of the major issues that personal-data privacy is posing.” Ethics is distinct from the legal sphere and more related to moral principles, although the line between ethics and law is blurred, as Maxime D’Angelo Petrucci, a data and technology attorney at Clifford Chance, explains. “When it comes to the field of personal-data protection, the line between legal and ethical is very thin and one does not go without the other,” he says. “One major aspect of GDPR [General Data Protection Regulation, a new regulation in EU law on data protection and privacy] is that it brings together both visions, integrating the notions of ‘lawfulness’ and ‘fairness’. It helps consider both aspects in new cases.”

For clarity, lawfulness is when something is allowed or permitted by law, while fairness is defined as impartial fair treatment, in line with individuals’ expectations, without any discrimination.

So can an ethical dilemma be brought before a court of law? “It can be admissible because, in France, we work with a civil-law-based system,” explains Mulliez. “Here, judges tend to apply legislation in its strictest sense, while Anglo-Saxon countries reason more based on the notion of equity.” This leaves more room for interpretation and unusual cases than it does in France, but serious ethical matters deserve a shot in court when needed. As mentioned, the line between legal and ethical is very thin. “The admissibility of ethical matters will have to be implemented over the long term anyway, especially with the new challenges that artificial intelligence will bring,” adds Mulliez.

How are developers reacting to unethical requests?

In 2018, Stack Overflow surveyed 100,000 developers worldwide, asking how they would respond when faced with an unethical request. Only 58.5% stated they would say no if they were asked to write code for an unethical purpose, while 36.6% were more ambivalent and said it would depend on what was being asked, and 4.8% said they would be prepared to do it. What we can take from those figures is that there is definitely a gray area around ethical matters, making it difficult to assess the lines that shouldn’t be crossed. More than half of the respondents, however, considered upper management responsible for ethical matters and any consequences of bad decisions, which means that when it comes to such matters, prevention is better than cure. The problem is that there is no set of common standards that applies to the whole industry.

How can developers react?

Opposing the boss is no easy task, especially when there is no concrete law to cover the issue. The first thing to do when a request doesn’t feel right is to take your time. Do not answer straight away. Websites such as CNIL , which details GDPR and rules applying to personal-data-based technologies, give you an overview of the legitimacy of the request from a legal point of view. If it falls under the scope of ethics, you will have to trust your gut. Does the request seem reasonable or not? Does it match your personal values and what you wish to achieve professionally? Could it lead to further unreasonable tasks? What is at stake for the company and the customers? What do you risk, and what impact will it have on your vision of your work, of your company, and your reputation? You will have to process all this information before deciding. But always start by taking the time to evaluate the situation, the possibilities, and eventual compromises that can be made.

If there’s no room for compromise and your boss sticks to their opinion, you will have to take action. Timing is important, as Mulliez explains: “The first step is to take advice from a legal expert before production starts. You must take action as far up the chain as possible, when the project is still at the prototype stage. Afterwards, written proof of your discussions is essential. This falls under social law, but the best thing to do is to write a clear refusal to your direct supervisor, with the HR department in copy. One day, you might have to show that you clearly stated your refusal, so it is important to keep track of email discussions and any paper trail documenting the project. Legally, you need to bring proof that you clearly opposed the project if you do not want to be held accountable.” Express your reservations, making clear that you are not comfortable fulfilling the task requested because it does not match your values. This shows mental strength and professional conscience without any aggression.

Codes of ethics: A new way to prevent bad practice?

In order to prevent questionable practices that are not governed by law, some firms have started implementing a code of ethics for their employees. “There is indeed a trend among big tech companies to use codes of conduct. This is a practice encouraged by regulators and public authorities, especially in the field of artificial intelligence,” says D’Angelo Petrucci. The scope of these codes covers all kinds of practices that are discouraged in the company and mainly serve to prevent any discrimination or other unfair use of personal data. Anyone can prepare a code of ethics, but D’Angelo Petrucci recommends collaboration: “You do not officially need legal counseling to build up a code of conduct, but it is recommended you involve various stakeholders, including the company’s legal department, in its creation. This will help you proceed in line with employment, corporate, privacy, and all other procedures that must be complied with.”

However, Mulliez cautions against expecting too much of codes like this, as they “don’t have a true legal value.” She adds that “no employee can be forced to sign a code of conduct and an employee has the right to express reservations about some clauses.” Nevertheless, a code of conduct offers useful insight into a company’s values and the practices it discourages. It also indicates that the company is willing to preserve a sense of ethics and good practice in its business as well as in its culture, and will likely take action if needed. Companies should therefore be encouraged to set down the common standards expected of employees and the moral guidelines to refer to in case of disagreement.

Stricter regulations: The first step in achieving better behaviors and data protection

The increase in scandals concerning unethical programming raises the question of regulation: What can authorities do to prevent such situations in the future? D’Angelo Petrucci explains that this surge of cases correlates with the increasing presence of data in our lives: “More and more companies use personal data to create, deliver, and improve their products and services. It has become an essential part of business. And the more companies use data to do business, the greater the risk of misuse.”

Mulliez points out that, with GDPR, which applies to companies in the European Economic Area (EEA) and, in certain circumstances, to those outside it, the sanctions for data misuse have become more significant: fines can now reach 2%-4% of turnover. “We’ve always had several regulations in France, such as law number 78-17—Informatique et libertés (Computing and Freedom)—passed in 1978, but far fewer sanctions, and their scope is much smaller than what GDPR can cover. It’s a new regulation that includes the notion of ‘privacy by design.’ This places specific prerequisites, such as personal-data protection, on any product or service you want to create. Companies are now also obliged to carry out a Data Protection Impact Assessment (DPIA) when their project raises, or is likely to raise, high privacy risks. This tool assesses the concrete impact of personal-data use, examining the nature of the data and its purpose, and defining how exposed it is.”

This new kind of strict regulation is progressively expanding abroad, in a few states of the US and in Japan. The US cannot stay passive in the face of the major changes being implemented in Europe, but it does not intend to reproduce its model, at least not completely. In July 2018, the White House claimed that it was considering “a consumer privacy protection policy that is the appropriate balance between privacy and prosperity.” Obviously, GDPR is setting a new global standard and will change the game, although Mulliez warns, “There will be a need for adjustment, especially when facing new situations with the development of AI and blockchain. But I believe that the right balance will be found to protect both sides’ interests, companies and users.”

In conclusion, the new regulation should progressively encourage respectful and conscientious practices. Mentalities are changing, and consumers are becoming more aware of what they share, forcing companies to better protect consumers’ interests. D’Angelo Petrucci concludes: “We are facing a whole new trend, where public authorities, regulators, experts, and consumer groups are thinking about and trying to address emerging ethical issues related to personal data, technology, and profiling.” However, profit will always lead some managers to cross the line and take the unethical path. Keep in mind that you have the right to say no, and keep track of all the exchanges that relate to an issue. And remember, prevention is better than cure. Do not hesitate to push your company to focus more on ethical issues and to encourage the creation of a code of ethics. “In the future,” says D’Angelo Petrucci, “we can expect more and more tech-related codes of conduct to emerge in companies and be used as serious guidelines for professional ethics.”

This article is part of Behind the Code, the media for developers, by developers. Discover more articles and videos by visiting Behind the Code !



Embedding ethics in computer science curriculum

Harvard initiative seen as a national model.

Barbara Grosz has a fantasy that every time a computer scientist logs on to write an algorithm or build a system, a message will flash across the screen that asks, “Have you thought about the ethical implications of what you’re doing?”

Until that day arrives, Grosz, the Higgins Professor of Natural Sciences at the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), is working to instill in the next generation of computer scientists a mindset that considers the societal impact of their work, and the ethical reasoning and communications skills to do so.

“Ethics permeates the design of almost every computer system or algorithm that’s going out in the world,” Grosz said. “We want to educate our students to think not only about what systems they could build, but whether they should build those systems and how they should design those systems.”

At a time when computer science departments around the country are grappling with how to turn out graduates who understand ethics as well as algorithms, Harvard is taking a novel approach.

In 2015, Grosz designed a new course called “Intelligent Systems: Design and Ethical Challenges.” An expert in artificial intelligence and a pioneer in natural language processing, Grosz turned to colleagues from Harvard’s philosophy department to co-teach the course. They interspersed into the course’s technical content a series of real-life ethical conundrums and the relevant philosophical theories necessary to evaluate them. This forced students to confront questions that, unlike most computer science problems, have no obvious correct answer.

Students responded. The course quickly attracted a following and by the second year 140 people were competing for 30 spots. There was a demand for more such courses, not only on the part of students, but by Grosz’s computer science faculty colleagues as well.

“The faculty thought this was interesting and important, but they didn’t have expertise in ethics to teach it themselves,” she said.

In response, Grosz and collaborator Alison Simmons, the Samuel H. Wolcott Professor of Philosophy, developed a model that draws on the expertise of the philosophy department and integrates it into a growing list of more than a dozen computer science courses, from introductory programming to graduate-level theory.

Under the initiative, dubbed Embedded EthiCS, philosophy graduate students are paired with computer science faculty members. Together, they review the course material and decide on an ethically rich topic that will naturally arise from the content. A graduate student identifies readings and develops a case study, activities, and assignments that will reinforce the material. The computer science and philosophy instructors teach side by side when the Embedded EthiCS material is brought to the classroom.

Grosz and her philosophy colleagues are at the center of a movement that they hope will spread to computer science programs around the country. Harvard’s “distributed pedagogy” approach is different from many university programs that treat ethics by adding a stand-alone course that is, more often than not, just an elective for computer science majors.

“Standalone courses can be great, but they can send the message that ethics is something that you think about after you’ve done your ‘real’ computer science work,” Simmons said. “We want to send the message that ethical reasoning is part of what you do as a computer scientist.”

Embedding ethics across the curriculum helps computer science students see how ethical issues can arise from many contexts, issues ranging from the way social networks facilitate the spread of false information to censorship to machine-learning techniques that empower statistical inferences in employment and in the criminal justice system.

Courses in artificial intelligence and machine learning are obvious areas for ethical discussions, but Embedded EthiCS also has built modules for less-obvious pairings, such as applied algebra.

“We really want to get students habituated to thinking: How might an ethical issue arise in this context or that context?” Simmons said.


David Parkes, George F. Colony Professor of Computer Science, teaches a wide-ranging undergraduate class on topics in algorithmic economics. “Without this initiative, I would have struggled to craft the right ethical questions related to rules for matching markets, or choosing objectives for recommender systems,” he said. “It has been an eye-opening experience to get students to think carefully about ethical issues.”

Grosz acknowledged that it can be a challenge for computer science faculty and their students to wrap their heads around often opaque ethical quandaries.

“Computer scientists are used to there being ways to prove problem set answers correct or algorithms efficient,” she said. “To wind up in a situation where different values lead to there being trade-offs and ways to support different ‘right conclusions’ is a challenging mind shift. But getting these normative issues into the computer system designer’s mind is crucial for society right now.”

Jeffrey Behrends, currently a fellow-in-residence at Harvard’s Edmond J. Safra Center for Ethics, has co-taught the design and ethics course with Grosz. Behrends said the experience revealed greater harmony between the two fields than one might expect.

“Once students who are unfamiliar with philosophy are introduced to it, they realize that it’s not some arcane enterprise that’s wholly independent from other ways of thinking about the world,” he said. “A lot of students who are attracted to computer science are also attracted to some of the methodologies of philosophy, because we emphasize rigorous thinking. We emphasize a methodology for solving problems that doesn’t look too dissimilar from some of the methodologies in solving problems in computer science.”

The Embedded EthiCS model has attracted interest from universities — and companies — around the country. Recently, experts from more than 20 institutions gathered at Harvard for a workshop on the challenges and best practices for integrating ethics into computer science curricula. Mary Gray, a senior researcher at Microsoft Research (and a fellow at Harvard’s Berkman Klein Center for Internet and Society), who helped convene the gathering, said that in addition to impeccable technical chops, employers increasingly are looking for people who understand the need to create technology that is accessible and socially responsible.

“Our challenge in industry is to help researchers and practitioners not see ethics as a box that has to be checked at the end, but rather to think about these things from the very beginning of a project,” Gray said.

Those concerns recently inspired the Association for Computing Machinery (ACM), the world’s largest scientific and educational computing society, to update its code of ethics for the first time since 1992.


Curriculum at a glance

[Figure: A sampling of classes from the Embedded EthiCS pilot program and the issues they address]

In hope of spreading the Embedded EthiCS concept widely across the computer science landscape, Grosz and colleagues have authored a paper to be published in the journal Communications of the ACM and launched a website to serve as an open-source repository of their most successful course modules.

They envision a culture shift that leads to a new generation of ethically minded computer science practitioners.

“In our dream world, success will lead to better-informed policymakers and new corporate models of organization that build ethics into all stages of design and corporate leadership,” Behrends says.

The experiment has also led to interesting conversations beyond the realm of computer science.

“We’ve been doing this in the context of technology, but embedding ethics in this way is important for every scientific discipline that is putting things out in the world,” Grosz said. “To do that, we will need to grow a generation of philosophers who will think about ways in which they can take philosophical ethics and normative thinking, and bring it to all of science and technology.”

Carefully designed course modules

At the heart of the Embedded EthiCS program are carefully designed, course-specific modules, developed collaboratively by computer science faculty and philosophy graduate student teaching fellows.

A module created by Kate Vredenburgh, a philosophy Ph.D. student, for a course taught by Professor Finale Doshi-Velez asks students to grapple with how machine-learning models can be discriminatory and how that discrimination can be reduced. An introductory lecture sets out a philosophical framework of what discrimination is, including the concepts of disparate treatment and disparate impact. Students learn that eliminating discrimination in machine learning requires more than simply reducing bias in the technical sense. Even setting a socially good task may not be enough to reduce discrimination, since machine learning relies on predictively useful correlations, and those correlations sometimes result in increased inequality between groups.

The module illuminates the ramifications and potential limitations of using a disparate impact definition to identify discrimination. It also introduces technical computer science work on discrimination — statistical fairness criteria. An in-class exercise focuses on a case in which an algorithm that predicts the success of job applicants to sales positions at a major retailer results in fewer African-Americans being recommended for positions than white applicants.
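The disparate impact definition mentioned above can be made concrete with a small sketch. The example below computes the disparate-impact ratio (the "four-fifths rule" from U.S. employment guidelines), one commonly taught statistical fairness criterion. The applicant data, function names, and 0.8 threshold here are illustrative assumptions, not material from the actual course module.

```python
# Hypothetical sketch of one statistical fairness criterion:
# the disparate-impact ratio ("four-fifths rule").
# All data below is invented for illustration.

def selection_rate(decisions):
    """Fraction of applicants the model recommends (1 = recommend, 0 = reject)."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group selection rate to the higher one.
    Under the four-fifths rule, values below 0.8 are commonly
    flagged as evidence of potential disparate impact."""
    rate_a, rate_b = selection_rate(group_a), selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Invented model outputs for two applicant groups.
group_a = [1, 1, 1, 0, 1, 1, 0, 1]   # selection rate 0.75
group_b = [1, 0, 0, 1, 0, 0, 0, 1]   # selection rate 0.375

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate-impact ratio: {ratio:.2f}")  # 0.375 / 0.75 = 0.50
```

A ratio of 0.50 would fail the four-fifths test, which is the kind of finding students then have to interpret: the metric flags a disparity, but deciding whether the disparity constitutes wrongful discrimination requires the philosophical framework, not just the arithmetic.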

An out-of-class assignment asks students to draw on this grounding to address a concrete ethical problem faced by working computer scientists (that is, software engineers working for the Department of Labor). The assignment gives students an opportunity to apply the material to a real-world problem of the sort they might face in their careers, and asks them to articulate and defend their approach to solving the problem.


What It Takes To Be an Ethical Programming Professional


Written By Andrej Kovačević


Ethics in software is a bigger issue than you might imagine. In today’s world, software is involved in almost every part of our lives. It manages the complex supply chains that put food on our supermarket shelves. It helps first responders figure out how to reach us when there’s an emergency. It even makes decisions that can affect our very freedom to walk the streets.

As programmers, we don’t give those things much thought as we do our work, and understandably so. They’re issues that seem far above our pay grade. But the reality is that every line of code we create has consequences, and they can be severe. For evidence, look no further than the apparent control-software flaws that doomed two of Boeing’s 737 Max jets, claiming 346 lives.

It’s easy to argue that a situation like that resulted from a cascade of failures that were the fault of the whole organization, and that’s certainly true.

It’s also easy for the average independent programmer to dismiss the scenario as something that doesn’t apply to them, since they don’t work for companies producing software with life-or-death stakes, and that could be true as well.

But when you consider that some of the software in question may have originated at an outsourced programming firm that didn’t have the necessary expertise for the work, it starts becoming clear that we, as an industry, can’t afford to keep passing the buck. There’s no way to know when we’ll be faced with making a decision about the ethics involved in something we’re working on.

For that reason, it’s important for all programmers, developers, and software engineers to take steps to define some ethical boundaries in advance. We, as an industry, have to seek out the right knowledge and decide where we draw the line between earning our pay and sleeping well at night. Here are some tips on where to begin that journey and how to make sure to be an ethical programming professional.

Understand What Ethics Is (and What It Isn’t)

As programmers, we all started somewhere. For some of us, it was tinkering with bits of code and dabbling in online coursework to learn how things worked. For others, it was in a university, struggling with assignments and working toward a computer science degree. No matter the path, the one commonality we (almost) all share is that nobody took the time to teach us about ethics.

If you ask almost any programmer whether they’ve run into ethical issues in their work, they’re likely to say no. If you ask a more specific question, though, such as whether an employer has ever asked them to forgo unit testing in the interest of time, or whether they’ve felt pressure to misrepresent the status of part of a project to a client, the answer is likely to change.

The reason is that it’s just plain hard to identify a programming-specific ethical issue, even when it’s staring you in the face. Still, there’s no programmer working today who hasn’t faced one or two such issues. Casual violations of intellectual property rights, when it’s clear some code has been lifted from another project, or a client’s insistence on intrusive activity logging without any outward disclosure that it’s taking place, come to mind. Most of the time, these practices are so commonplace that they just don’t stand out, unless you know what you’re looking for.

To help, every programming professional should take some time to study the concept of ethics and gain some knowledge in that area. It’s easy to do. A good place to start is to simply reflect a little on your own concept of right and wrong. Think about some of the work you’ve done in the past, and ask yourself: “If I were on the other side of the screen, would I be OK with this?”

You can even do some independent study to brush up on your knowledge of applied ethics. There’s certainly no shortage of online resources on the topic. MIT even offers free online courses in ethics, as do several other reputable universities.

If you have time, pick up a copy of author Sara Baase’s book A Gift of Fire: Social, Legal, and Ethical Issues for Computing Technology . It’s a deep dive into the myriad ways that ethical questions intersect with modern technology, and it contains a wealth of real-world examples to illustrate all of the concepts. Just make sure to buy a legitimate copy of the book. Although it may be available for free elsewhere online, going that route would be highly unethical in itself.

Don’t Cover Up the Lapses of Others


Sometimes the lapse is one you inherit. That’s what happened to Medium user dCFO, who took over a piece of Medicare billing software and discovered that it had been programmed to defraud the government. In that particular case, the solution was to report the fraud to the proper authorities. In such cases, leaving well enough alone is always the wrong way to go.

To uphold high ethical standards as a programmer means calling out substandard or inappropriate code wherever you find it. Whenever possible, you should report anything unusual to your immediate supervisor if you have one. If they won’t listen, keep reporting up the chain of command until you find someone willing to take action. If you’re working as a freelancer, you should communicate your concerns with your client directly.

If you’re in a situation where those above you seem content to sweep an ethical issue under the rug, or worse—are complicit in the problem—you might have to look for another way to call attention to the situation. It’s not always necessary to put your own name on the report (as in the case above, where the programmer tipped off the authorities anonymously), but it’s critical to call attention to anything shady that you can’t fix on your own. To do otherwise means you’re passing the buck, and the next programmer that encounters the issue is going to assume you were in on it too. If you make a documented effort to fix the problem, you’ll be in the right no matter how the situation eventually turns out.

Don’t Be Afraid To Say No

Programmers, and especially the independent variety, make their living moving from project to project. That brings with it a certain urgency to take on whatever work comes along no matter what it is. Needless to say, that can lead to a variety of ethical conflicts. After all, freelance programmers often don’t know all of the specifics of what they’re going to be working on until they’re knee-deep in code.

Once that happens, there’s tremendous pressure to keep saying yes to the client in the interest of getting paid for all the effort already made. But what happens when, with countless hours already invested, the client asks for a feature or change that’s unethical? If you went into the project with your ethical boundaries firmly established, the answer is that you have to put your foot down and say no.

You also have to make it clear why you’re saying no. As unpleasant as those conversations can be, they’re necessary to let the client know that you understand what they’re asking for is unethical. When you have to do this, it’s important to start the conversation without being confrontational. Don’t lead with the assumption that the client knows they’re in the wrong. Pose the issue to them as a question, leading with phrases like “Did you know that …” or “I think this is a problem, but can you explain to me why you don’t see it that way?”

Most of the time, the client will back down when called out in this manner. Sometimes, they’ll even respond with genuine surprise—after all, nonprogrammers often fail to see the ethical implications of what they’re asking you to do; they just want results. If they don’t, though, the next step is to walk away. As much as it may hurt financially, it’s the right thing to do.

Engage in Ethics Discussions With Other Professionals

If there’s one thing that programmers tend to do with regularity, it’s talk about their projects. That’s why there are so many blogs and communities dedicated to programming in all its forms. As it turns out, those are also excellent forums for discussing the ethical implications of the things we work on.

For a start, make it a habit to ask other professionals what they think when they’re confronted with a challenging ethical dilemma. You might be surprised at how many others have faced similar problems, and they may be able to offer valuable advice on what to do. Even if your situation is unique, it’s still worthwhile to bounce the problem off others who can understand what you’re talking about.


When one developer publicly shared a painful ethical episode of this kind, it became a learning experience that sparked a conversation around programming ethics and encouraged others to share issues they’d encountered. Almost as if a great torrent had been unleashed, simultaneous discussions sprang up on StackExchange, Stack Overflow, Reddit’s r/programming forum, and elsewhere, with countless programmers sharing their experiences. Those places remain a great sounding board for programmers who encounter ethical dilemmas, even today. The more people who take the time to tell their stories, the better off we, as an industry, will be.

Pledge To Do Your Best

Although there aren’t many professional organizations for programmers that provide an ethical framework to follow, there are some published pledges that can serve as basic words to live by for the average programmer. A more in-depth resource that makes a fantastic guidebook is Robert Martin’s book The Clean Coder: A Code of Conduct for Professional Programmers. It’s filled with actionable advice that can help any programmer embrace ethical practices and elevate their craft. Both are excellent places to start.

Remember, though, Rome wasn’t built in a day. It’s unreasonable to expect that you’re going to make the right ethical call in every situation you encounter. It takes plenty of time to get used to standing by your ethical boundaries and applying them to everything you do. Nor can you expect the programmers you work with to suddenly embrace ethical programming standards. No matter how those around you act, it’s critical to stand firm. Eventually, your attitude will rub off on those around you.

The main thing is to always approach every project with a discerning eye and stay on the lookout for anything that seems less-than-aboveboard in each project’s requirements or the existing code involved. Then, make it a point to always do your best to do the right thing and maintain your integrity.

If you’ve made the right preparations and have taken steps to spell out your own ethical boundaries in advance, you won’t have to wonder what to do when something questionable crosses your desk. You’ll be able to spot problematic situations early and be able to focus on your response. And when that happens, your response will be more sure-footed and likely to yield a positive ethical result.

It won’t always be easy. It won’t always be good for your bottom line. But maintaining high ethical standards is a responsibility all programmers share, and as our work becomes more deeply ingrained in the world around us, the stakes grow ever higher. Remember, the results of all of our work are all around us. Lives and livelihoods depend on it. And there’s no way to know when an ethical judgment call you make today will have real-world consequences in the future.

So, the most critical takeaway is to spend the time now to develop your ethical boundaries and learn how to handle complicated ethical situations when they come up. It’s more than worth the effort. Remember, at the end of the day, we’re all in this together. If we all do our part, our industry will rise to the challenge of driving thoughtful, ethical, and unassailable code to power the future. It’s all up to us.


Ethics In Seeking Programming Assignment Help Service


In today’s educational world, students often turn to online resources for programming assignment help. They do so frequently because they encounter a steady stream of programming assignments throughout their academic careers.

However, after getting that help, students often incorporate it into their assignments without thinking about the ethics of doing so. You might wonder what ethics could be involved in seeking programming assignment help, especially if the concept is new to you.

In this article, we will discuss the ethical points a student should keep in mind when asking for help with programming assignments.

What Is Programming Assignment Help & Why Is Its Demand Increasing?

Programming assignments are part of any CS student’s life, spanning languages such as C, C++, Java, Python, JavaScript, and many more. Inevitably, some of the questions in any assignment will seem unfamiliar.

In such cases, students seek programming assignment help from websites where experts in the relevant field complete the assignment within a given time and return it to the student, who pays for the service.

Demand for programming assignment help is growing day by day, for several reasons:

  • Plagiarism-free Solutions: These services provide customized, plagiarism-free solutions that a student can use in an answer sheet.
  • Extensive Learning: Students are exposed to topics beyond their curriculum.
  • 1:1 Mentorship: Many services offer one-on-one mentorship that supports continuous learning.
  • Error-free Code: Services promise code that has been tested and debugged.
  • Expert Guidance: When students get stuck on an assignment, experts provide targeted guidance, and sometimes the solution itself.

What Are The Ethics In Seeking Programming Assignment Help?

First, we will discuss the ethics a student should follow while seeking help from a programming assignment help service. Then we will discuss the unethical practices some students engage in after receiving that help.


After submitting an assignment to a website, some students behave in ways that create trouble for the service. Keep the following points in mind while seeking help with a coding assignment:

Stick To Your Deadline:

When you submit your programming assignment, you specify the deadline by which you need the solution. It is common practice to set a deadline earlier than your actual submission date, and the problem is not with that.

Occasionally, however, students contact the website’s representative and ask to change the deadline, sometimes cutting it nearly in half and making the work far harder for the experts handling the assignment. Suddenly shortening an agreed deadline is not ethical.

Don’t Misuse The Reviewing Process:

Many programming assignment help services let students give feedback on a draft solution, and some students misuse this feature: even when they find the draft acceptable, they keep pushing for alternative approaches to the same question.

The experts end up producing several solutions, and after a few wasted days the student accepts the first one. That is a complete waste of the experts’ time and is considered an unethical practice. Don’t harass people who are being paid for honest work.

Treat Them As You Would Faculty or Friends:

Most programming help websites present themselves as a friend to the students asking for help, and even under tight deadlines their experts put real effort into delivering an acceptable, plagiarism-free solution.

Even so, some students behave rudely, reasoning that because they are paying for the assignment, they can act however they like. Whatever field you work in, you should be calm and respectful toward others, and that applies especially to students.

What Are The Ethics After Seeking Programming Assignment Help?

Now let’s turn to what students do after receiving help. We are not against taking help from websites that provide programming assignment help services. We are against the unethical practices some students engage in once they have that help.

Such practices ultimately harm a student’s career, and nobody wants that. Let’s walk through how to replace unethical habits with ethical ones after getting programming assignment help.

Plagiarism and Coding:

The most unethical practice is copying a solution from an online resource and pasting it into the answer sheet. This is especially common with programming questions, where students copy the code without trying to understand it.

With theoretical questions, students at least read the material and rephrase it in their own words. Academic institutions take plagiarism seriously: it can earn you zero marks in the course or even academic probation.

Instead of copying the code wholesale, understand it and write your own version. If you take help from a reputable programming assignment help service, plagiarism should not be an issue.

Focus On Collaboration, Not Cheating:

Students need to understand the difference between collaboration and cheating. Collaboration is a valued way of working, and every programming assignment can be solved collaboratively: you get help from an individual or an expert, and the resulting solution is acknowledged as joint work.

Cheating, by contrast, undermines the contributions of others: you inflate your own role to earn more marks and put your name on work that isn’t yours. Be honest about what you have actually developed.

If the work was done collaboratively, say so; don’t claim others’ contributions by overstating your own.

Properly Citing Resources and Help:

Taking help from outside resources is not an offense in itself; the offense lies in hiding the help you have taken. Be upfront, honest, and transparent about the assistance you received from websites or individuals.

If you developed a piece of code with help from a website, add that website’s name to your assignment answer script, just as final-year project documentation lists reference links at the end of the document.

Listing your sources in the same manner builds your reputation as an honest student who acknowledges the help they have taken.

Balancing Between Getting Help and Learning:

Seeking help should not mean bypassing the effort of learning. Nobody is born knowing everything; knowledge grows through continuous study and daily experience.

Whatever solution you receive as help, learn it as well. Don’t just paste it in with a credit line; understand what is given there, and if needed, ask the person who provided the answer to explain it.

That habit keeps getting help and learning in balance, so you don’t slide into simply copying and pasting from your sources.

Knowledge Is Not Material For Reselling:

One last point: as a student, you should not resell your knowledge. Whatever knowledge you gain from these resources, don’t resell it to fellow students for money.

You are welcome to share knowledge free of charge if you wish, but reselling it to recoup what you paid for it is completely unethical. Name your source and let others benefit from the same knowledge.

Don’t chase money during your student life. This is the time to develop honesty and ethical habits, and those good habits will pay off along your future career path.

How To Find The Right Programming Help Service For You?

Now that we have clarified how to seek programming assignment help ethically, we would like to end with a bonus note: how to find the programming assignment help service that suits you best.


Thousands of programming assignment help services are available on the internet. To pick the right one, keep the following points in mind:

  • Service Coverage: Check the services offered on the website. If they cover the programming language you need, you are good to go.
  • Fair Pricing: From the enormous number of services, filter out those that offer the best price; the one that suits your budget is the best.
  • Experts’ Reviews: Check the reviews of that help service. Find out what other customers say about the experts’ quality and the service workflow.
  • Punctuality: Find out how many programming assignments the service has delivered within the deadline to date. It will help you pick the right service.
  • Communication Channels: Prefer a service that offers multiple ways to communicate, such as social media, personal chat, and email.

Conclusion:

As we saw, ethically seeking programming assignment help is necessary for developing good character.

Proper ethics should be followed both while taking programming assignment help and afterwards. You should not copy any item or piece of code without giving proper credit to the contributor. Be honest and transparent about what you are working on.

It’s a good idea to grasp the basic concepts of any programming language you use. With a strong foundation, you’ll find yourself needing less programming assignment help, and a solid understanding of the basics will help you tackle more complex challenges.

In case you require any assistance with coding or programming homework assignments, you can check out CodingZap’s top-notch services. So, visit our website today!

Sounetra Ghosal


COMPUTERS, ETHICS, SOCIETY and HUMAN VALUES

===========================

Module Overview: MODULE 0 ORIENTATION

Some simple steps to follow under ASSIGNMENTS that will enable students to orient themselves to the class website and to the BlackBoard website for the class.

Module Learning Objectives: The learner will:

  • Explore  the BlackBoard Website
  • Explore the class website
  • Learn to locate the calendar, assignments, and textbook
  • Understand the Course Objectives and assignments
  • Understand the Grading System and what is expected of the student
  • Practice sending an email to the instructor
  • Learn which GROUP you are in by clicking the GROUP button on the class website or the BlackBoard site. Enter the GROUP area in BlackBoard, begin discussions, and select a group leader. Share what you are good at, your ideas about the three types of group work you will be doing, and how you might contribute; the group leader should emerge from those discussions.

Required Readings: ALL COURSE INFORMATION DOCUMENTS on the CLASS WEBSITE

Assignments: check the calendar

Discussions:   check the calendar

=============================================================================

Module Overview: MODULE 1 INTRODUCTION TO THE COURSE

This is the first module of the course. In this module you will do a bit of reading, then enter into the discussions, send in a written assignment, and start working within your group.

Module Learning Objectives:  The Student will learn about:

  • the range of topics and issues covered in the class.
  • the history of Computing
  • the historical development of Computers and Ethics

Required Readings:   ONLINE TEXTBOOK   Chapter 1.

Suggested Readings:

Chapters 1-2, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Discussions:  check the calendar

Module Overview: MODULE 2 COMPUTERS and ETHICS

This module introduces the particular ethical issues associated with computers and information systems.

  • the Ethical Issues associated with Computers
  • what is unique about computers and information systems as far as ethics and morality are concerned
  • approaches to the problems and issues

Required Readings:   ONLINE TEXTBOOK Chapter 2

Chapter 1, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Module Overview: MODULE 3 ETHICS

A presentation of several of the major ethical traditions or schools of thought.

Module Learning Objectives: The learner should:

  • learn how humans become moral
  • distinguish legality from morality
  • identify problems with ethical relativism
  • identify the basic ethical traditions
  • understand the basic principle of the GOOD for each tradition
  • understand the weaknesses of each of the traditions
  • understand how to approach thinking about moral problems using dialectical thinking

VIDEO : Objective of Module Three at  http://www.youtube.com/v/YIoUj1zFM94

Required Readings:    ONLINE TEXTBOOK   Chapter 3

Suggested Readings: Chapter 2, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Module Overview : MODULE 4 Law: Free Speech and Censorship

A presentation of the matter of Freedom of Speech as it applies to issues related to computers and the internet and information systems and networks.

Module Learning Objectives: The learner will

  • understand the basic concept of freedom of speech
  • relate freedom of speech to the internet
  • analyze cases related to freedom of speech and the internet
  • understand the issues involved with the CDA and its relation to freedom of speech and the need to protect classes in society

Required Readings:   ONLINE TEXTBOOK Chapter 4

Suggested Readings: 

Chapter 3, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Chapter 8, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Module Overview: MODULE 5 Intellectual Property

Presenting and discussing matters of Intellectual property and the foundations for the concept and the basis for the right to control IP in a world with digital information and software.

Module Learning Objectives:  The learner will  be able to answer the following questions:

  • What sort of property is IP?
  • How is software IP?
  • What is the basis for a claim to IP?
  • What are the threats to IP posed by Computer Technology and Information Technology?

Required Readings:   ONLINE TEXTBOOK Chapter 5

Chapter 6, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 8, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 4, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Module Overview: MODULE 6 Privacy

A presentation on the matter of privacy and its forms: communication privacy, information privacy, and psychological privacy.

Module Learning Objectives: The learner will be able to answer these questions

  • What is the need for privacy?
  • How does Privacy relate to democracy?
  • How is privacy to be considered in relation to the Social Good?

Required Readings:   ONLINE TEXTBOOK Chapter 6

Chapter 5, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 7, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 2, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Module Overview: MODULE 7 Secrecy and Security

 A presentation on matters of secrecy of information and the protection or security of that information.

Module Learning Objectives: The learner will answer these questions

  • What needs are there to protect information held in computers and information systems?
  • How is security to be provided?
  • How much security?
  • When is it necessary to violate security?

Required Readings:   ONLINE TEXTBOOK Chapter 7

Chapter 3, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Assignments : check the calendar

Module Overview: MODULE 8 Mid Course Evaluations and Changes

A week to take stock. A week for a small rest. A week that gives you an opportunity to revise papers and to revise the approach to the course, in an effort to improve the learner's performance and final grade.

Your group might submit a draft or description of what you are doing for the instructor's feedback during this week.

Evaluations of the course, the subject matter, and student performance will take place during this module.

Module Learning Objective:

Students will discuss what problems they are encountering with the course and make suggestions as to how to improve the course and their performance in the course.

Students will become more aware of what is required to do well in the course.

Required Readings:   None

Assignments:   None

Discussions: Several.  They are the only feature of this module.

Just respond in the Discussion Board to the lead questions concerning the class and how it is going.

Module Overview: MODULE 9 Crime and Misbehavior

A presentation on a variety of crimes and troublesome behavior associated with computer and information technologies and the Internet.

Module Learning Objectives:  The learner will answer these questions

  • What are the differences among acts that are merely poor etiquette (or netiquette), acts that are illegal, and disruptive acts falling in between?
  • What are the available responses to the variety of behaviors?
  • What are the responsibilities of computer professionals?

Required Readings:  ONLINE TEXTBOOK Chapter 8

Chapter 4, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 7, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Module Overview: MODULE 10 Information Technology Accountability

A presentation on the concept of accountability.

  • What are the differences among accountability, responsibility, liability, and blame?
  • What are the forms of responsibility?
  • What is the problem of Diffusion of Accountability?
  • What is the Myth of Amoral Computer Programming?
  • How are professionals and non-professionals to deal with the matter of responsibility?

Required Readings:  ONLINE TEXTBOOK Chapter 9

Chapter 7, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Module Overview: MODULE 11  Computing and Information Technology as Professions and Professional Codes

  A presentation on Computing and Information Technology and Professionalism

Module Learning Objectives:  The learner will answer these questions

  • What is a profession?
  • Is computing a profession?
  • Is software engineering a profession?
  • What are the special responsibilities of Professionals?
  • What are the Professional Codes and how do they operate?

Required Readings:   ONLINE TEXTBOOK Chapter 10

Chapter 3, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 6, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 10, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Module Overview: MODULE 12 Social Change

A presentation on the impact of Computer Technology and Information Technology on Society

  • What has been the impact on forms of expression and freedom of expression?
  • What has the technology done to the existence of the individual as an autonomous being?
  • What have been the other major changes resulting from the technologies?

Required Readings:  ONLINE TEXTBOOK Chapter 11

Chapter 8, Johnson, Deborah, Computer Ethics, 3rd ed. (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapters 11-12, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapters 8-9, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Module Overview: MODULE 13 Political Change

A presentation on the impact of Computer Technology and Information Technology on Political Institutions

  • What has been the impact of the technologies on forms of expression, and what effect have those changes had on Democracy?
  • What does the technology offer to support Democracy?
  • What does the technology offer that threatens Democracy?
  • What is the Digital Divide, what are its variations, and how do they relate to the political process?

Required Readings:  ONLINE TEXTBOOK Chapter 12

Chapter 5, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Module Overview: MODULE 14 Artificial Intelligence: Computers and Being Human

  A presentation on the impact of Computer Technology and Information Technology on how humans regard themselves as human beings

  • How does artificial intelligence lead humans to rethink their humanity?
  • How much should computers be trusted in making decisions for humans?
  • How do the technologies lead humans to think about responsibility?
  • What challenges do the technologies present to humans insofar as their basic humanity is concerned?

Required Readings:   ONLINE TEXTBOOK Chapter 13

Chapters 4 and 10, Hester, D. Micah & Paul J. Ford, eds., Computers and Ethics in the Cyberage (Prentice Hall: Upper Saddle River, NJ, 2001)

Chapter 4, Baase, Sara, A Gift of Fire, 2nd ed. (Prentice Hall: Upper Saddle River, NJ, 2003)

Module Overview: MODULE 15 CULMINATING ACTIVITY: BONUS EXERCISE

In this Culminating Activity module, you will complete a survey and enter into discussions that will provide feedback to the instructor to be used in the evaluation and revision of the course for the next time it is taught.

Module Learning Objectives:

The objective is to provide valuable feedback to the instructor.

=============================================================


University of Michigan Athletics


Big Ten Releases U-M's Conference Assignments for 2024-25

5/7/2024 2:00:00 PM | Women's Basketball

By: Sarah VanMetre

ROSEMONT, Ill. -- In conjunction with the Big Ten Conference, the University of Michigan women's basketball program announced Tuesday (May 7) the home and away designations for the upcoming 2024-25 conference season. The conference season will still consist of 18 games, even with the addition of the four West Coast schools.

Each team will play 16 single-game opponents (eight home, eight away) and one home-and-away opponent. Michigan will face Michigan State in its one home-and-home series.

U-M's home opponents are Indiana, Iowa, Michigan State, Northwestern, Ohio State, Penn State, Rutgers, Washington and Oregon. Michigan will be on the road for games at MSU, Illinois, Maryland, Minnesota, Nebraska, Purdue, Wisconsin, UCLA and USC.

Michigan is coming off its sixth straight NCAA Tournament berth in 2023-24, reaching the 20-win mark for the 11th time under J. Ira and Nicki Harris Family Head Women's Basketball Coach Kim Barnes Arico.

Season tickets are now on sale for the 2024-25 season and start at just $75. Purchase your season tickets here and don't miss a moment of the action.


COMMENTS

  1. Stanford Embedded Ethics

    Embedding Ethics in Computer Science. This site provides access to curricular materials created by the Embedded Ethics team at Stanford for undergraduate computer science courses. The materials are designed to expose students to ethical issues that are relevant to the technical content of CS courses, to provide students structured opportunities ...

  2. A new program at Stanford is embedding ethics into computer science

    Rather than have ethics be its own standalone seminar or dedicated class topic that is often presented at either the beginning or end of a course, the Embedded EthiCS program aims to intersperse ethics throughout the quarter by integrating it into core course assignments, class discussions, and lectures.

  3. A new initiative seeks to integrate ethical thinking into computing

    Stanford launches an embedded EthiCS program to help students consistently think through the common issues that arise in computer science. October 9, 2020 ... Each module will include at least one lecture and one assignment that grapples with ethical issues relevant to the course. But Creel says she and her collaborators are also working on ...

  4. New program embeds ethics into computer science courses

    Stanford launches an embedded EthiCS program to help students consistently think through the common issues that arise in computer science. Students were then tasked to provide recommendations to ...

  5. PDF Integrating Ethics into Introductory Programming Classes

    programming is the perfect place to begin ethics integration. Our intervention focused on replacing existing assignments with new assignments contextualized with ethical dilemmas and concepts. ... In the 2019/2020 academic year, we piloted our approach in three different introductory programming courses at University of Col-

  6. Building an Ethical Computational Mindset

    Stanford's embedded ethics program will ensure that more students understand the importance of ethics in a technological context and signal that ethics is integral to their work. Technology is facing a bit of a reckoning. Algorithms impact free speech, privacy, and autonomy. They, or the datasets on which they are trained, are often infused ...

  7. PDF 'This Applies to the Real World': Student Perspectives on Integrating

    ing students on their attitudes towards ethics based assignments [10], in order to gather more detailed feedback that both provided guidance for iterating on the assignment, and ideas that will help ... integrate ethics into programming assignments more broadly, we drew from our own assignment creation process, starting with listing social issues ...

  8. Harvard works to embed ethics in computer science curriculum

    Curriculum at a glance. A sampling of classes from the Embedded EthiCS pilot program and the issues they address: Great Ideas in Computer Science (the ethics of electronic privacy); Introduction to Computer Science II (morally responsible software engineering); Networks (Facebook, fake news, and the ethics of censorship); Programming Languages (verifiably ethical software systems).

  9. Integrating Ethics into Computer Science Education

    When applied to computer ethics education, this framework is orthogonal to the structure and content of the initiative, as I illustrate using examples of dedicated ethics courses and embedded modules. It therefore highlights additional features of cross-disciplinary teaching that need to be considered when planning a computer ethics programme.

  10. Integrating Ethics into Introductory Programming Classes

    This paper presents one approach to ethics integration into such classes: assignments that teach basic programming concepts (e.g., conditionals or iteration) but are contextualized with real-world ethical dilemmas or concepts. ... Effective incorporation of ethics into courses that focus on programming. ACM SIGCSE Bulletin, Vol. 37, 1 (2005 ...

  11. Computing, Ethics, and Society

    Computing, Ethics, and Society Foundations. Course 1 • 24 hours. Identify and manage ethical situations that may arise in their careers. Understand and be able to apply ethical frameworks to help them analyze ethical challenges. Identify some of the main ethical issues that arise in the use of the internet, including privacy, security, and ...

  12. ACM Code of Ethics and Professional Conduct

    The ACM Code of Ethics and Professional Conduct ("the Code") expresses the conscience of the profession. The Code is designed to inspire and guide the ethical conduct of all computing professionals, including current and aspiring practitioners, instructors, students, influencers, and anyone who uses computing technology in an impactful way.

  13. Uncovering Ethical Concerns in Programming

    Uncovering Ethical Concerns in Programming. This post is part of a series following the progress of Ethics Lab's collaboration with the Computer Science Department that began with the Mozilla-sponsored ResponsibleCS challenge. On day one of Dr. Ray Essick's Advanced Programming class, 85% of students in the course either agreed or strongly ...

  14. Unethical and illegal practices in coding

    Sophie Vande Kerkhove. Cases of programming practices being ...

  15. Embedding ethics in computer science curriculum

    Under the initiative, dubbed Embedded EthiCS, philosophy graduate students are paired with computer science faculty members. Together, they review the course material and decide on an ethically rich topic that will naturally arise from the content. A graduate student identifies readings and develops a case study, activities, and assignments ...

  16. (PDF) "This Applies to the Real World": Student Perspectives on

    integrate ethics into programming assignments more broadly, we drew from our own assignment creation process, starting with listing social issues the assignment could be based on.

  17. 'This Applies to the Real World': Student Perspectives on Integrating

    ethics-oriented assignment concepts alongside students. Deriving from tech controversies that participants felt most affected by, we created a bank of ideas as a starting point for further curriculum development. CCS CONCEPTS • Social and professional topics; KEYWORDS ethics, introductory programming, CS1, social impact, assignments,

  18. Programming ethics

    This article gives an overview of professional ethics as applied to computer programming and software development, in particular the ethical guidelines that developers are expected to follow and apply when writing programming code (also called source code), and when they are part of a programmer-customer or employee-employer relationship. These rules shape and differentiate good practices and ...

  19. What It Takes To Be an Ethical Programming Professional

    Understand What Ethics Is (and What It Isn't) As programmers, we all started somewhere. For some of us, it was tinkering with bits of code and dabbling in online coursework to learn how things worked. For others, it was in a university, struggling with assignments and working toward a computer science degree. No matter the path, the one ...

  20. Code of Ethics

    Ethical and social computing are embodied in the ACM Code of Ethics. The core values expressed in the ACM Code inspire and guide computing professionals. The actions of computing professionals change the world, and the Code is the conscience of the field. Serving as the Hippocratic Oath for the IT Professional, the Software Engineer, the Programmer and all those responsible for shaping and ...

  21. Programming Ethics

    Before you get to programming, there are programming ethics you need to understand. Not just understand, but follow. ACM: The ACM is the professional society for professional computer scientists. They have framed a set of codes and ethics that any computing professional should live by. ... Don't have your friend or spouse write your assignments ...

  22. Ethics In Seeking Programming Assignment Help Service

    As we saw "Ethically Seeking Programming Assignment Help" is a necessity to develop a good personality. Proper Ethics should be followed while taking the help along with after taking the programming assignment help. You should not copy any item or piece of code without providing proper credit to the contributor.

  23. COMPUTERS_ETHICS_MODULE_OVERVIEWS

    Module Overview: MODULE 4 Law: Free Speech and Censorship. A presentation of the matter of Freedom of Speech as it applies to issues related to computers and the internet and information systems and networks. Module Learning Objectives: The learner will. understand the basic concept of freedom of speech.

  24. Big Ten Releases U-M's Conference Assignments for 2024-25

    Big Ten Releases U-M's Conference Assignments for 2024-25. 5/7/2024 2:00:00 PM | Women's Basketball. Share: By: Sarah VanMetre. ROSEMONT, Ill.-- In conjunction with the Big Ten Conference, the University of Michigan women's basketball program announced Tuesday (May 7), the home and away designations for the upcoming 2024-25 conference season ...