Words of Caution for Educators Using ChatGPT

A Mixed Reaction From Students

It is becoming clear that ChatGPT is not a fad and will continue to be used increasingly in colleges and universities. Its use, like that of other AI in education, can be met with resistance because it walks a fine line between questionable academic integrity and a valuable educational tool. Generative AI systems like ChatGPT can give inaccurate or misleading results, whether because of vague prompts or poor data sources, and the technology’s limitations mean it can stumble even on relatively simple queries.

Ethical risks include a lack of transparency, erosion of privacy, poor accountability, and workforce displacement and transitions. The question is whether AI systems should be trusted. To build trust through transparency, organizations should clearly explain what data they collect, how it is used, and how the results affect customers.

Data security and privacy are important issues to consider in deciding whether to use ChatGPT. Given its access to vast amounts of data including sensitive financial information, there is a risk that this data could be compromised, either through hacking or other means. Proper security measures should be in place to protect the data from unauthorized access.

Views of Students

Survey results reported by Resume Builder in “1 in 4 companies have already replaced workers with ChatGPT,” updated on March 27, 2023, show that ChatGPT and similar AI applications are increasingly being used by college students. The results include the following:

  • 47% of college students have used ChatGPT or a similar AI application.
  • 50% of those who have used ChatGPT have used it to complete assignments or exams (22% of all college students in the survey).
  • 57% do not intend to use it or continue using it to complete their schoolwork.
  • 31% say their instructors, course materials, or school honor codes have explicitly prohibited AI tools.
  • 54% say their instructors have not openly discussed the use of AI tools like ChatGPT.
  • 60% report that their instructors or schools haven’t specified how to use AI tools ethically or responsibly.
  • 61% think AI tools like ChatGPT will become the new normal.
  • 51% agree that using AI tools to complete assignments and exams counts as cheating and plagiarism, while 20% disagree. The rest are neutral.
  • 40% say that using AI defeats the purpose of education.

Educator Views

Universities should have clear policies that address the use of ChatGPT in the classroom and by students in completing assignments and exams. Academic integrity must underlie these policies.

The main issue for educators is that ChatGPT has the potential to facilitate cheating by students without being detected. This has implications for academic integrity and could erode critical thinking skills and undermine the fundamental values of higher education.

Some educators insist that using ChatGPT is unethical, in part because each time the same question is posed, it produces somewhat different answers, which makes detection difficult. Others insist that it is simply another form of technology, like Google or Wikipedia, that increases efficiency in obtaining information. Dave Epstein cautions that the technology can go wrong, pick up unreliable or even false material, and present it as fact; the output from ChatGPT must be evaluated and a judgment made about whether it is true, reliable, and useful for the question being answered.

Not all educators believe that using ChatGPT is a bad thing. Writing for Scientific American, John Villasenor tells the students in his class at the UCLA School of Law that they can use ChatGPT in their writing assignments. He contends that the time when a person had to be a good writer to produce good writing ended in 2022, and educators need to adapt. He suggests that rather than banning students from using labor-saving and time-saving AI writing tools, educators should teach students to use them ethically and productively.

What Are Educators Doing?

Some professors are phasing out take-home, open-book assignments — which became a dominant method of assessment in the pandemic but now seem vulnerable to chatbots. They are instead opting for in-class assignments, handwritten papers, group work and oral exams.

Educators must also consider that the bot might provide two different answers to the same query, and the answers may even change over time.

Implications for Education

ChatGPT can facilitate the use of advanced teaching methodologies, promote interactive learning, and help develop students’ critical thinking skills. It can be used to solve complex problems, generate summaries and reports, make recommendations, and conduct data analysis. However, the bot may not collect information from reliable sources, so the information it provides may be outdated, incorrect, or biased.

Clarify Expectations

Educators must make clear to students exactly what their expectations are regarding the use of ChatGPT. It is important to be explicit and transparent about the limitations on its use. A statement to that effect should be included in the course syllabus, such as whether it can be used for assignments and research papers.

Educators must be sensitive to the possible misinformation and biases that can taint the reliability of information provided by ChatGPT. They should also be aware that overreliance on AI-generated content can inhibit critical thinking and creativity and lead to plagiarism and other violations of academic integrity.

Educators should recognize that ChatGPT is here to stay. Rather than banning it, they should find ways to incorporate it into the curriculum.

Blog posted by Steven Mintz, PhD, on October 26, 2023. Find out more about Steve’s professional activities on his website (https://www.stevenmintzethics.com/). You can sign up for his newsletter and connect on LinkedIn (https://www.linkedin.com/company/ethics-sage/about/).
