If you followed the news yesterday, you may have heard that the Biden Administration announced its first-ever Executive Order on AI. We at Fstoppers are here to break down the main points for you and explain how they will affect you as a photographer.

In his landmark order, Biden stated that the goal of these guidelines is to “ensure that America leads the way in seizing the promise and managing the risks of artificial intelligence (AI).” The executive order aims to implement a comprehensive strategy that protects Americans from AI-enabled threats while promoting innovation and competition in the field. It also sets out the goal of protecting Americans' privacy.

The opening paragraphs remind us that Biden has secured commitments from 15 leading companies to drive the safe and trustworthy development of AI. Among them are Google, Meta, OpenAI, Microsoft, IBM, and, most importantly to us, Adobe. The new order covers a wide range of purposes, from preventing the use of AI to develop dangerous biological weapons to addressing algorithmic discrimination. In this article, I will touch on the two points that are most relevant to the photography industry:
  1. The executive order directs an evaluation of how agencies collect and use commercially available information, including information procured from data brokers, and strengthens privacy guidance for federal agencies to account for AI risks. This work will focus in particular on commercially available information containing personally identifiable data.
  2. The Department of Commerce is tasked with developing guidance for content authentication and watermarking to clearly label AI-generated content. This will serve to protect Americans from AI-enabled fraud by establishing standards and best practices for detecting AI-generated content and authenticating official content.
The White House brief didn’t go into detail about how these mandates will be carried out, but they are part of a broad effort both to protect against the risks of AI and to bolster its development.

Does this mean we will be protected against the unauthorized use of millions of our images by companies such as Midjourney to train their AI models? Surely not. This executive order is a set of guidelines issued by the administration with the goal of leading the way in ethical AI development. It is not, however, a law.

For me, a highlight of the briefing was seeing Adobe listed among the companies voluntarily partnering with the administration to share information on data collection and security testing. I look forward to seeing the standards the Department of Commerce develops for labeling AI-generated content. Of course, we at Fstoppers will keep you apprised of developments as they roll out.