- After transitioning from Fairfield University's leader of enterprise systems to director of IT strategy and enterprise architecture for the state of Connecticut, Armstrong will return to higher-ed leadership in January.
- State governments are expected to deploy AI in 2026 with an increased focus on returns on investment as they face complex policymaking restrictions enacted by a recent executive order signed by President Donald Trump.
- To prevent students from relying on artificial intelligence to write and do homework for them, many professors are returning to pre-technology assessments and having students finish essays in class.
- During a virtual event hosted by the Brookings Institution, experts and lawmakers explored the benefits and risks of AI, as well as the possible regulatory structures that could help guide its advancement.
- Sacramento City Unified School District has implemented a policy barring students from using generative artificial intelligence for homework or research without a teacher's approval.
- Tech leaders gathered in Washington, D.C., this past week for public and private meetings with Congress on the future of AI in the U.S. What happened, and what's next?
- A five-year program coordinated by the University of Texas at San Antonio and UT Health San Antonio allows students to work toward a medical degree and a master's in artificial intelligence at the same time.
- Researchers at Carnegie Mellon University devised a string of code that could unlock ChatGPT and make it do things it was programmed not to. Now they're working on a "mind reader" tool to study how it makes decisions.
- A new push from startup gov tech firm Polimorphic seeks to offer more efficient and precise searches of government websites. If successful, it could lead to reduced calls to public agencies and happier citizens.
- Recently addressing the disruption ChatGPT and other tools have brought to global education, the international cooperation agency recommends new laws and regulations, training and forward-thinking public debate.
- Catching convincing AI-fabricated evidence is still a work in progress, but courts could benefit from thinking now about how they might confront the challenges posed by the emerging technology.
- Greg Brockman will have an onstage conversation with UND President Andrew Armacost, to be followed by a panel discussion with faculty from computer science, law, mathematics, entrepreneurship, writing and theater arts.
- As Hollywood actors and writers continue to strike for better pay and benefits, California lawmakers are hoping to take action that will protect workers from being replaced by their digital clones.
- More than 20 tech and civil society leaders, including the chief executives of five of the biggest U.S. companies, appeared at a closed-door Senate meeting this week to shape how artificial intelligence is regulated.
- California has taken one more step toward regulating the booming AI industry, this time with a broad-strokes bill from a state senator that aims to regulate how the technology is built and how it affects Californians.
- The use of chatbots is exploding across government agencies at all levels, according to survey data. A local government expert weighs in on the dos and don'ts of implementing one that actually works.
- As generative artificial intelligence products rise, pressing ethical issues still need to be addressed, such as what AI companies owe to the creators whose work informs their chatbots.
- The collaboration and shared learning made possible through the smart region consortium known as The Connective enhances tech work for cities that are members — such as Phoenix, Mesa and Surprise, Ariz.
- Though their services are illegal in some countries, companies that combine generative AI and human labor to write essays that are undetectable by anti-cheating software are soliciting clients on TikTok and Meta.
- For families and students who lack home Internet or personal devices, the introduction of technologies like artificial intelligence in schools may only exacerbate digital inequities.
- To supplement what students learn about AI in school, developers should produce guidelines on how to use their products in a way that's readily understood by people with varying degrees of "traditional" and digital literacy.