Artificial Intelligence
-
A survey of 386 global experts suggests governments, businesses, educators and communities must act together to counter dangerous overreliance, displaced workers, mental health problems and other risks from AI.
-
The document outlining the Trump administration’s approach to AI signals less regulation and more innovation. To plan for it, state and local governments must understand what it includes — and what it omits.
-
Napa Valley Unified School District's school board recently approved 10 principles to guide AI use by students and staff, mirroring recommendations from the nonprofit California School Boards Association.
-
Companies are embracing cyber defenses based on generative AI hoping to outpace attackers’ use of tools like FraudGPT, the “villain avatar of ChatGPT.” But more effort is needed, experts warn.
-
The Nampa City Council authorized the city's police department to buy nearly $79,000 worth of technology from Cellebrite, a company that sells tools to unlock phones and obtain their data for police and government agencies.
-
Led by the co-chairs named earlier this month, the members of MetroLab Network’s GenAI for Local Governments Task Force will work together to create a comprehensive resource of guidance on the use of AI technology.
-
Ohio's TALEN pilot program aims to create a statewide real-time crime center linking a network of thousands of public and private cameras. Records reveal several obstacles have stalled the project.
-
As educators learn how to navigate AI, the question remains of how it will be used in the classroom, and one superintendent in Illinois says the answers must come from those at the local level.
-
Experts say AI is now present in everything from apps and facial recognition software to ChatGPT, which allows users to ask questions and receive human-like replies based on data harvested from the Internet.
-
States are starting to hire experts to navigate both the opportunities and the trickier aspects of AI. Maryland's Nishant Shah says job No. 1 is establishing a set of principles that set the foundation for everything else.
-
A prototype microchip design that was revealed today by IBM could pave the way for a world of much smarter devices that don't rely on the cloud or even the Internet for their intelligence.
-
At the Google Public Sector Forum, the tech giant announced new efforts focused on AI, citizen engagement, cybersecurity and other areas. The work could influence activities at all levels of government.
-
A new report released last week by the Urban Libraries Council outlines five recommendations for how public libraries can use artificial intelligence technologies in their work to serve communities.
-
As state and local governments cautiously pursue AI, they must prioritize ethics, transparency and accountability in procurement to protect public interests and deliver on the technology's potential.
-
New York University and the Korea Advanced Institute of Science and Technology will collaborate on research to study how advances in artificial intelligence will affect society.
-
Starting this spring, Louisiana State University's humanities and social-science departments will begin teaching students how to use artificial intelligence in research related to their fields.
-
New York City has launched the MyCity Business Services chatbot in a beta form to help residents get information about starting or operating their businesses. The city also released an AI Action Plan to guide responsible city government use of the tech.
-
Artificial intelligence is quietly revolutionizing non-emergency calls in 911 dispatch centers.
-
At the NASCIO Annual Conference in Minneapolis, Arkansas CTO Jonathan Askins echoed his peers' cautious optimism about AI in government and said agencies won't have a second chance to get it right.
-
By the end of the year, Baltimore residents who don’t speak English will be able to communicate with 911 services in their native language, without waiting for an interpreter, officials say.
-
Half of teachers say they know a student who was disciplined or faced negative consequences for using — or being accused of using — generative artificial intelligence like ChatGPT to complete a classroom assignment.