The evolving capabilities of Generative Artificial Intelligence tools present exciting opportunities as well as challenges for scholarly research. Used well, such tools can facilitate certain aspects of the research process. However, they can also stifle creativity, interfere with the learning process, impede critical thinking, and expose students to allegations of academic misconduct. The Anthropology Department is committed to providing students with the skills needed to operate effectively in an AI-enabled world, while exercising sound academic judgement and upholding rigorous standards for ethical scholarly practice. To this end, the Department endorses a “limited use” policy for students, unless otherwise specified by individual course leaders.
Using generative AI at LSE: guidance for the Department of Anthropology and for students registered in any AN course
Generative AI tools can analyse data and generate text, computer code, and images from minimal human prompting. Over the last year, such tools have proliferated and are now embedded in everyday software and services.
As a student in the Department of Anthropology – or taking an Anthropology course – it’s important that you understand how you can use generative AI in your assessments, so that you can do your best in your studies.
Because LSE is a pedagogically diverse School encompassing many disciplines, there is no single approach to the use of generative AI across the School. Instead, the School has developed central guidance that allows different approaches to be taken by different academic departments and course convenors. You should familiarise yourself with the School's guidance, which is available here. Within the School's guidance, there are three positions to know about:
- Position 1: No authorised use of generative AI in assessment.
- Position 2: Limited authorised use of generative AI in assessment.
- Position 3: Full authorised use of generative AI in assessment.
In Anthropology we follow:
- Position 2: Limited authorised use of generative AI in assessment.
Position 2
This means that generative AI tools can be used in specific ways for assessments in the Department of Anthropology.
Please read carefully the permitted uses of generative AI tools in the Department of Anthropology, outlined below. Other courses you are taking may have different policies on generative AI usage; this guidance applies only to courses taken in the Department of Anthropology.
The ways in which generative AI tools may be used in the Department of Anthropology, as well as the ways in which they may not, are outlined below:
- When preparing for Anthropology assignments, you may use generative AI tools to assist with gathering information from across sources and assimilating it for your own understanding. Generative AI tools may be used to assist with searches for relevant literature, to aid your understanding of terms or concepts, and to direct further research.
- When writing the text for Anthropology coursework assignments (including formative work, summative coursework essays, and take-home assessments), the only generative AI tools allowed are those that assist with grammar and spell-checking.
- It is not permitted to include text produced by any generative AI system in any formative or summative assessment submission. In other words, all work submitted in Anthropology (AN courses) must be written solely by the student. Apart from spelling and grammar, generative AI tools should not be used to alter or improve the content or argumentation of draft texts.
- You cannot use any generative AI tools for in-person Anthropology exams taken in an exam hall.
- The output of a generative AI search cannot be quoted or relied on in any formative or summative work as a source supporting statements or arguments made in the assessment.
Although the Anthropology Department allows the limited uses of generative AI detailed above, students relying on the output of such tools must exercise care and apply independent scholarly judgement. AI-generated content tends to be generic and can contain errors, be incomplete, or be biased in various ways. Students should also be aware that using generative AI at the research and planning stage can limit creative and critical thinking and impede the development of original ideas. It can also interfere with the Department’s requirement that students engage extensively with course readings and other materials (including lectures and seminar discussions), as indicated in the Department’s grade criteria.
If you have any questions about the use of generative AI tools for coursework, please speak with your course teacher.
In all submissions where you may use generative AI for anything other than grammar or spelling, you must cite its usage. Failing to cite the use of generative AI, or using generative AI in any academic work where its use is not permitted, is a form of academic misconduct.
Generative AI at LSE: resources and tools
As a student at LSE, you have access to Microsoft’s generative AI tool, Copilot, for free. Please ensure you understand how to use Copilot effectively and responsibly in your studies, using our guidance.
To help you further develop your understanding, you can attend a workshop run by the Digital Skills Lab on maximising the benefits and mitigating the risks of Copilot, and there is also a Moodle course on developing your AI literacy.
The Copilot workshop and AI literacy Moodle course can also help prepare you for the world of work, by ensuring you have the generative AI skills you’ll need to support your future career aspirations.