Letter: AI regulation is an issue of states’ rights, national criticality

Jeremy Straub, an assistant professor in the NDSU Computer Science Department, writes, "U.S. AI regulation helps our competitors as it slows our technological development progress in this area."


Policymakers in Washington, D.C., are beating the drums of artificial intelligence (AI) regulation. Due, perhaps, to the human-like speech capabilities of ChatGPT, those outside traditional computer science circles have taken note of AI – and they are concerned. Senate Majority Leader Chuck Schumer, D-N.Y., has proposed sweeping AI legislation. Some computer scientists and a variety of commentators have suggested various ways of regulating AI, ranging from a moratorium on the development of some forms of AI to AI-specific liability models to collaborative regulation.

AI regulation is good for big tech businesses, which know how – and can afford the lawyers – to navigate it. It's also good for America’s strategic competitors. Russia, for example, has an interest in leading in AI. Its president, Vladimir Putin, has said that “the nation that ‘becomes the leader in this sphere will become the ruler of the world.’” U.S. AI regulation helps our competitors as it slows our technological progress in this area.

AI regulation is not good for North Dakota and its nascent AI industry and researchers. Importantly – and perhaps fortunately for those in North Dakota – the regulation of AI technology development shouldn’t be a federal issue; it is a state one. Some aspects of AI, though, are even beyond states’ regulatory authority.

The U.S. federal government was created with a limited set of powers which are enumerated in the Constitution and explicitly constrained by its 10th Amendment, which states that “the powers not delegated to the United States by the Constitution, nor prohibited by it to the States, are reserved to the States respectively, or to the people.”

The creation and non-commercial use of AI is fundamentally the expression of programmers and is, thus, protected by the First and 14th Amendments to the U.S. Constitution. However, regulation beyond this is within the province of government.


While some aspects of AI use may fall under federal regulatory jurisdiction, due to their potential connection to interstate commerce or other sources of federal regulatory power, the same argument cannot be made for AI development or for AI used solely within the state where it was developed. These are inherently left to the states to regulate.

This means that North Dakota can control – or, better yet, choose intentionally not to regulate – the development of AI within our state. AI tools and techniques can be developed, within North Dakota, to help educate students in our schools, aid farmers in growing their crops and manage citizens’ data. North Dakota can also develop a workforce ready to embrace and utilize AI technologies for the betterment of its citizens and humanity.

We can also lead the way in technology and policy development and serve as a model for other states to emulate. The ability of different states to take different regulatory approaches, compare their results, and adapt policies based on what works well in other states (and what doesn’t) is a tremendous benefit of America’s federal system. It is critical to use this approach in areas of rapid change and growth, such as AI.

State control provides an opportunity for North Dakota to create a pro-AI, pro-business regulatory environment. This will attract businesses and researchers from other states that enact their own restrictions on AI technology development, or that fail to push back against federal restrictions should those exceed federal authority. Through the intentional deregulation of AI development, we can drive economic growth and position the state as a national – if not global – leader in the AI space.

The eventual use of AI technologies developed locally can, of course, be regulated by existing and future laws to ensure that the public is protected from any potential for harm. Use-based regulation protects technology development and is also more compatible with constitutional expression protections. Combining intentionally de-regulated development, pro-AI-development policies and use-based restrictions provides an excellent – and safe – environment for the development of AI technologies and the growth of AI tech startups within North Dakota.

Jeremy Straub is an NDSU Challey Institute Faculty Fellow and an assistant professor in the NDSU Computer Science Department. The author’s opinions are his own.

This letter does not necessarily reflect the opinion of The Forum's editorial board nor Forum ownership.
