With the rapid development and widespread use of Artificial Intelligence programmes over the past couple of years, we at The Carbon Literacy Project have needed to develop a response to their use within evidence submission. We understand that AI can be used in a variety of ways: as a research tool, as a writing tool, and as a way to find answers to questions. We support the appropriate use of software that helps learners communicate their own thoughts. However, to protect the integrity of the assessment, we do need to exclude the use of AI simply to generate the content of a submission.
Application of AI in Evidence Generation
Spelling and Grammar
Programmes such as – Grammarly
Spelling and grammar tools can help learners feel more confident with their writing, and make suggestions in a similar way to Microsoft Word’s in-built spelling and grammar check. This sort of tool may be of particular use to learners who are dyslexic.
Speech-to-Text
Programmes such as – Dragon NaturallySpeaking
Learners facing challenges such as dyslexia or physical difficulties may find it helpful to use a speech-to-text programme, dictating their responses rather than writing or typing them. This can make it easier for them to express their ideas and removes a potential barrier to engaging in the assessment process.
Content Generation
Programmes such as – ChatGPT
Content generation tools are trained to follow an instruction or question prompt and provide a detailed ‘crowdsourced’ response, drawing upon ‘sources’ originally used to train the AI and/or those found across the internet.
Conclusion
Carbon Literacy is not a test of a learner’s writing or language skills, and so there are some AI-derived programmes which we believe can improve accessibility for learners. However, there are other programmes that make it difficult to assess a learner’s individual understanding and application of the concepts covered in Carbon Literacy training, and whose use we actively discourage.
Tools such as those for Spelling and Grammar or Speech-to-Text can help learners with a variety of challenges get their thoughts and ideas across without compromising our ability to understand what a participant has learnt and how that has translated into action. We want the submission of evidence to be as inclusive as possible for all learners whilst maintaining a robust and consistent process for capturing the actions learners commit to, so we see no issue with their use in these scenarios.
However, we are not comfortable with learners using content-generation programmes as part of their Carbon Literacy evidence submission. We are looking for original content from learners to demonstrate their ‘Carbon Literacy’. We ask learners to draw on the learning from their Carbon Literacy training session and use it directly to come up with their own carbon-reduction solutions within their own sphere, explaining why these will be impactful and significant. If content-generation programmes are used when submitting evidence, it becomes impossible for our team to assess the learner.
In cases where these programmes have been used, the submission becomes more about what the AI programme knows about the topic, which is not what we are assessing. We use AI content-generation detection software if we suspect that these types of tools have been used by a participant with little or no learner input into the response or idea. In such cases, The Carbon Literacy Project will ask for the evidence to be resubmitted, and in some instances will ask for a short conversation with the learner (via Zoom) to gauge how much of their CL training they have understood.
Further Notes
We encourage trainers to highlight any additional learning needs for a particular group of learners, or for a specific learner, so that we can take these into consideration when reviewing evidence submissions. We also provide an opportunity within the Evidence Form for learners to highlight anything they would like us to take into consideration during the evidence review process. This can be found on page 2 of our Participant Details and Evidence Form, found here, where it states: “Use the space below to write anything that our certification team should be aware of when processing your application”. This helps our team make an accurate and informed decision. We meet learners where they are and reasonably adjust requirements depending on a number of factors relating to the individual and/or group, including ability, age, opportunity and influence. It is for this reason that Carbon Literacy training is so adaptable and can be used as a learning tool for a variety of audiences across society.
We would always encourage a trainer to have a conversation with us if, at any point, they are concerned their learners will struggle with the evidence submission process. We would like to continue improving the accessibility of Carbon Literacy for learners, and are happy to work with trainers to create bespoke solutions where necessary.
As AI programmes and tools continue to be developed, The Carbon Literacy Project will continually review our assessment processes to ensure we are being as fair and inclusive as possible.