The Washington state Office of Superintendent of Public Instruction (OSPI) will soon issue guidance embracing artificial intelligence as an educational tool, according to Washington State Superintendent Chris Reykdal.
Reykdal announced the news Wednesday in his annual state of the K-12 education system address.
“We are going to lead the nation and the world in this,” Reykdal said. “Our guidance will be a model for the country. It’s nearly done and will be out soon.”
So far, only Oregon and California have issued guidance around AI and education. With this new guidance from OSPI, Washington will be at the forefront of seeing AI as “a tool for critical thinking,” Reykdal said in an interview.
This new guidance could mark a shift in how educators view AI’s role in schools and its impact on education.
There are fears among Washington state educators that AI could increase cheating and plagiarism. However, new studies show that OpenAI’s ChatGPT tool didn’t increase cheating among high schoolers.
Some school districts, such as the Edmonds School District, have cautiously embraced the use of ChatGPT and are training staff and working the technology into classrooms.
“Young people are using [these tools], some educators are using them, and everyone is fearing them,” Reykdal said.
This is why he and the OSPI team are working quickly to issue this new guidance.
“We’re going to help young people understand it’s not a tool to be feared,” Reykdal said. “It’s a production tool more than anything else and it has to come with ethical standards and ethical frameworks.”
With the guidance, Reykdal and OSPI will also develop a plan to educate teachers on AI tools.
Reykdal sees the technology as having the potential to change how students approach their work and their thinking.
With the current text-based AI tools, students receive heaps of information almost instantaneously.
“Instead of spending hours and hours finding information, students have to spend hours verifying information,” Reykdal said. “That’s a really cool critical thinking tool.”
One of the concerns around AI is the bias inherent in the technology.
“These models are only as good as everything we’ve ever thrown on the internet,” Reykdal said. That could mean an overrepresentation of Western thought and a likely white bias.
Reykdal believes that, as AI use grows in schools, teachers will need to ensure students scrutinize the information they receive after prompting an AI.
“The teacher can say, ‘You produced the thing — easy. Now, I want to spend time on why you think you got that answer. What other perspectives are there? How do you know there’s research behind that?’” Reykdal said.
In addition to scrutinizing the sources of information, students must also hone the questions they ask of these AI models in order to receive the information they need.
“Students can show their thinking,” Reykdal said.
AI scripts, the textual record of the questions asked of an AI and its responses, can allow teachers to see how students’ minds work when approaching a problem, according to Reykdal.
In the past, Reykdal explained, this type of critical thinking was only obvious in students who were already great writers. Now, with AI, students who maybe aren’t the best at writing and who have strengths in science and math, “can express that information now using language they didn’t have before,” he said.
AI could also help students who don’t speak English natively, making a difference in their education and in how they’re perceived by their teachers.
“It’s a huge opportunity to get to what kids really know and inquire about,” Reykdal said. “I think that’s incredibly positive.”
However, concerns remain around AI. For instance, the new guidance will need “the right ethical framework” since plagiarism is still a concern, Reykdal said.
Reykdal also worries about the future of AI. Restricting users or prioritizing particular publishers would create more bias, he explained.
“Let’s not turn this into something where the influencer who has the most money can get a biased result for someone who looks up questions for an essay on the Civil War,” he said.
For students to benefit from AI as a tool, Reykdal believes Congress needs to act to curtail private money’s influence on AI query results.
“If we’ve learned anything from social media and its impact on young people, it’s that algorithms should have some sort of periodic review to ensure that they’re not being written for bias,” Reykdal said.
OSPI will convene a committee to discuss the draft AI guidance, with a first meeting on Jan. 11. An initial draft should be distributed sometime next week.