At the Exceptional Women Alliance (EWA), we enable high-level women to mentor each other to achieve personal and professional happiness through sisterhood. As the nonprofit organization’s founder, chair, and CEO, I am honored to interview and share insights from thought leaders who are part of our peer-to-peer mentoring.
This month, I introduce to you attorney Wendy Lee, a leading AI and fintech law specialist at the Buchalter law firm. Wendy will share insights on the Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence.
How did you become an expert in AI and fintech?
I have been working in technology since my undergrad days, when I fell in love with database concepts and administration and teamed up with a video game company to create its point-of-sale system. In addition to this “hands on keyboard” experience, for the past 25 years I’ve advised technology companies, from startups to established platform SaaS solution providers, on regulatory compliance, strategic transactions, data sharing, and services contracts. I served as chief legal officer at a mortgage fintech as generative and other forms of AI hit the mainstream market, using the National Institute of Standards and Technology (NIST) principles of explainability to guide my practice for safe deployment of AI solutions in highly regulated environments.
Let’s dig into the legal side of things. The White House EO talks about fairness and accountability. How does that impact AI law?
There isn’t a federal AI law yet, but the policies are emerging. The Biden-Harris administration is working on policies and has asked all agencies to develop standards, tools, and tests to ensure their AI systems are safe, secure, and trustworthy. The government is putting on its own oxygen mask before passing any real federal law. And clearly the administration wants to influence whatever Congress may attempt to pass to further regulate this technology.
The Order wants government and companies to work together. How does that change who’s responsible for AI?
The Order and its reports do nothing to change existing laws, which apply to harms regardless of their root cause. In other words, if an AI-powered car caused an accident, the driver responsible for the car would still be responsible for the accident. If a financial institution made a discriminatory credit decision based on its AI tool’s output, that institution could be held in violation of applicable fair lending or servicing laws regardless of the tool used to formulate the decision.
If an AI algorithm messes up, does the Executive Order change who gets sued?
An executive order cannot change underlying product liability and other laws at the state and federal level, which balance risk and create causes of action to compensate injured persons and corporations for harms. AI harm is no different in that regard.
What new worker protections might we need with AI?
Influenced by the executive order, the U.S. Department of Labor Wage and Hour Division issued a memorandum guiding its staff on how to consider employers’ use of AI worker-productivity tracking when applying federal wage and hour laws to employers who use AI to supplement their time-tracking systems. Recommended worker protections include making it clear that employers are responsible for the accuracy of the AI technology, including new automated scheduling, tracking, and geolocation tools. We will see more guidance like this, designed to protect workers from any negative impact of their employer’s use of AI technology.
Can this Order help countries agree on AI rules?
Great question and noble goal, but no. There were some protectionist discussions, and the Order does impose reporting requirements on foreign actors who might be training U.S.-based models and on foreign resellers of this technology to U.S.-based companies. But we do need to join forces and align on this topic, because technology doesn’t see borders the way humans do.
Can we regulate AI without killing new inventions and patents?
This will be the ultimate balance. Use case developers will be best served by partnering with legal and compliance teams to make sure they have considered federal and state laws while designing safeguards for their technology and services offerings. For patent rights, the USPTO’s recent guidance indicated that human contribution will be a factor in rendering patent decisions.
How will the Order’s emphasis on “explainability” impact the law?
I love this question for so many reasons. My first career was teaching working professionals how to use Microsoft tools in Fairbanks, Alaska, in 1996. This was a tremendous opportunity for my 20-something brain to interact with some of the most intelligent minds in the oil industry on emerging technology. What I learned then, and still believe to be true, is that explainability will be one of the most challenging aspects of bringing this technology to market. U.S. consumers crave simplicity in their tech; Apple and Google have capitalized on this and derived great wealth in doing so. Luckily, NIST has long been thinking about this, and in 2021 issued a fantastic 23-page report on how to explain AI-related technology. That said, many legislators are not lawyers, and they will inevitably attempt to pass legislation based on technology that has been poorly explained or categorized based on its bad output. This is a classic reaction to new things, and I don’t anticipate we will be immune from it as the technology spreads across our country.
How does this Order change privacy protection?
There is no comprehensive federal privacy protection in the United States. We are operating under a patchwork of state laws that are ill equipped to meet modern data uses. Europe, Brazil, and China have comprehensive protections for their citizens. There are many attempts to legislate on this; the American Privacy Rights Act is pending in Congress now. Only time will tell if we will join the world stage in this important area of legal protection.
Larraine Segil is founder, chair, and CEO of the Exceptional Women Alliance.