Premier Danielle Smith revealed something unexpected during her Saturday radio show. She uses ChatGPT regularly. Her cabinet ministers do too. The admission sparked conversations across Alberta about artificial intelligence and its role in government decision-making.
Smith explained her approach with refreshing candor. She turns to AI when wrestling with complex policy questions. The technology helps her scan through mountains of information quickly. It shows her what other jurisdictions are trying. But she’s quick to add an important caveat. The AI gives her a framework, nothing more. Real humans still need to verify everything.
This matters more than you might think. Alberta’s government is making decisions that affect millions of people. Knowing that AI serves as a research assistant, not the final authority, should reassure folks. Smith emphasized that her team always double-checks the AI’s work. They talk to real experts. They verify the conclusions before moving forward.
The premier sees AI as a productivity tool rather than a threat. She made a bold prediction on air. People who learn to use AI will replace people who don’t. It’s not about machines stealing jobs. It’s about workers who embrace new tools outpacing those who resist change.
That perspective resonates in Edmonton’s evolving economy. Our city has always adapted to technological shifts. From the oil boom’s early days to today’s tech startups downtown, Edmonton knows how to pivot. AI represents another chapter in that ongoing story.
But Smith isn’t blind to the dangers lurking in this technology. She expressed particular worry about deepfakes. These convincing fake videos can make anyone appear to say or do anything. The technology has already fooled people around the world. Someone could create a video of Smith announcing a fake policy. Or worse, saying something damaging that never happened.
The provincial government is drafting legislation to address this threat. Details remain scarce, but the intent is clear. Alberta wants guardrails around AI’s most harmful applications. Deepfakes top that list because they undermine trust in what we see and hear.
I’ve watched deepfake technology evolve over the past few years. Early examples looked obviously fake. Now they’re frighteningly realistic. A friend recently showed me a video that I would have sworn was genuine. Only it wasn’t. That moment changed how I think about video evidence.
Schools present another frontier in this AI conversation. Smith mentioned working with school boards to determine appropriate boundaries. Should students use ChatGPT for homework? Where’s the line between helpful tool and academic cheating? These questions don’t have easy answers.
The premier did highlight one promising educational application. AI can help newcomers to Alberta learn while they’re still mastering English. A student from Syria could study math in Arabic while improving their English at the same time. That’s genuinely transformative for families settling in Edmonton’s diverse neighborhoods.
I think about the students at McNally High School or Strathcona High. Many come from immigrant families. Many speak multiple languages at home. AI tools could help bridge the gap between their native language and classroom instruction. That levels the playing field in meaningful ways.
Alberta’s aggressive courtship of AI companies brings its own complications. These data centers consume enormous amounts of electricity. Smith addressed this concern directly. The province welcomes AI companies, but with conditions attached. They must generate their own power. They can’t drain Alberta’s grid and drive up everyone’s electricity bills.
That’s a practical stance given Edmonton’s already strained power infrastructure. We’ve all seen our utility bills climb in recent years. Allowing massive data centers to pull from the public grid would make that worse. Requiring companies to build their own power generation shifts the burden appropriately.
Water usage poses another challenge. Traditional data centers use water for cooling. In a province where water conservation matters, that’s problematic. Smith mentioned newer facilities using solvents instead of water. This technological adaptation could make AI infrastructure more sustainable here.
Edmonton already hosts several data centers. Drive past certain industrial areas and you’ll see these nondescript buildings humming with activity. They generate heat and noise. They require substantial resources. As AI expands, we’ll likely see more of them.
The premier’s comments reflect a government trying to balance competing interests. She wants Alberta positioned at the forefront of AI development. The economic opportunities are too significant to ignore. But she also recognizes the technology’s darker potential.
This dual approach makes sense given Alberta’s circumstances. Our economy needs diversification beyond oil and gas. Tech industries, including AI, offer that path. But rushing headlong into any new sector without safeguards would be reckless.
I appreciate Smith’s transparency about her own AI use. Politicians often hide behind vague language about technology. She could have dodged questions about whether the government uses ChatGPT. Instead, she owned it. That honesty builds credibility even among people who disagree with her policies.
The legislation she promised will be worth watching. How do you legally define a deepfake? What penalties discourage their creation? How do you enforce rules about digital content that crosses borders instantly? These aren’t simple legal questions.
Alberta joins other jurisdictions wrestling with AI governance. The European Union recently adopted its comprehensive AI Act. Several U.S. states have introduced their own bills. Canada’s federal government is developing a national framework. Provincial legislation would add another layer to this evolving regulatory landscape.
For regular Albertans, these conversations might seem abstract. But they affect daily life in concrete ways. That job interview could involve AI screening your resume. That loan application might get evaluated by an algorithm. That news video might be completely fabricated.
Understanding how our government approaches AI helps us navigate these changes. Smith’s comments suggest a pragmatic middle ground. Embrace the technology’s benefits while guarding against its abuses. Whether that balance holds remains to be seen.
Edmonton’s role in this AI future deserves attention. Our universities conduct cutting-edge research. Our startup community continues growing. Our diverse population brings perspectives that can shape ethical AI development. We shouldn’t be passive observers in this transformation.
The coming months will reveal more about Alberta’s legislative plans. Public consultations seem likely. Industry stakeholders will weigh in. Civil liberties advocates will raise important concerns. Then comes the hard work of drafting laws that actually accomplish their goals.
Smith made clear that AI is here to stay in government operations. That’s probably true across every sector. The question isn’t whether we’ll use this technology. It’s how we’ll use it responsibly. Alberta’s answer to that question matters for all of us living here.