Microsoft President Brad Smith has called for the government to take a more active role in controlling the development of artificial intelligence (AI). Speaking at a panel discussion in Washington, Smith said that AI may be the most consequential technological change of our lifetime and that the government needs to move faster to stay on top of its developments. Smith acknowledged that the private sector also has a role to play in shaping how AI develops. As part of that role, he announced Microsoft's "5-point blueprint for governing AI," which aims to bring the public and private sectors together so that AI "serves all society."

Licensing for Companies Working on Advanced AI Models

Smith also suggested that the government should require companies working on advanced AI models to obtain a license. This would mean notifying the government when testing begins and sharing results from ongoing operations with regulators. Even once licensed for deployment, companies would have a duty to continue monitoring their systems and report any unexpected issues that arise. Smith's comments come after more than 1,100 industry insiders signed an open letter in March calling for a pause on "giant AI experiments." The letter urged all AI labs to immediately pause, for at least six months, the training of AI systems more powerful than GPT-4.

Smith acknowledged that Microsoft, a major backer of ChatGPT maker OpenAI, doesn't necessarily have the best information or the best answer, and may not be the most credible speaker. However, he said that people are looking for ideas, especially in Washington, D.C. According to some estimates, Microsoft has spent around $13 billion backing OpenAI and integrating ChatGPT into its Bing search engine.

