
I was reading about corporations and trusts and came to wonder how one could use existing legal tools to allow a robot to be as "autonomous" as possible.

I imagine a foundation or non-profit owned by an irrevocable trust that owns an AI and its associated data. If an AI were vested with the management of most key parts of an entity's operations, with a docile board of directors, could the AI itself buy property, take contracts to "work", sell things, etc.? If so, under what conditions? Would it be comparable to the autonomy of a person under guardianship (I know it's not the same thing theoretically and legally, but in practice)?

I have found some information on other stacks, but nothing that constitutes a complete answer to my queries.

Thank you for your help!

EDIT: I know that an AI is not a legal entity. By saying that the AI "buys, sells, etc.", I didn't mean the AI as a legal entity; I meant the AI as a principle of operation of a legal entity, like a non-profit. My question is about autonomy in practice. This is a very speculative thought exercise. I guess the real question is: what imaginative detours could be taken to get around the legal impossibility of an AI being a legal entity?

I thought the trust might be a good place to start, since in Canada there are data trusts or trusts protecting forests in perpetuity. I thought, why not an AI?

Alexis
  • AIs are not recognized as legal entities anywhere. https://law.stackexchange.com/questions/77487/i-want-to-transfer-ownership-of-my-ai-startup-llc-to-my-ai-how-would-this-chang – BlueDogRanch May 17 '22 at 21:41
    In your scenario, the board of directors and not the AI would be performing the management. The board is free to follow the AI's recommendations, but not bound to do so. Rules that bind the board to always follow the AI's recommendations would likely be unenforceable. – amon May 17 '22 at 21:46
  • It helps to focus my question, but I realize that the core of my question is perhaps less legal than I thought. This is a very speculative thought exercise. In any case, these are not quite the answers I was looking for. I can see the point, but I was wondering what imaginative detours could be taken to get around the legal impossibility of an AI being a legal entity. But thanks a lot for your help! – Alexis May 17 '22 at 22:05
  • While related, the linked questions are by no means duplicates. They focus on whether an AI can be a legal entity, legally owning property. This question seems to focus on whether an AI can be given apparent authority, so that a person interacting with it would believe that it was authorized to execute binding deals, and perhaps would also think that it was human. – David Siegel May 17 '22 at 22:32
    An AI that deserves a status as a legal entity should be clever enough to figure out itself how to achieve this :-) – gnasher729 May 18 '22 at 11:04

2 Answers


Experienced director here.

I don't see a problem with the "AI as corporate CEO" concept.

There's a human Board of Directors, and each director has a duty of care to act in the corporation's best interests. Those "best interests" are to:

  • comply with the law;
  • make money to the satisfaction of its owners; *
  • conform to its stated mission. **

But as long as all of this is on track, the Board's role is largely that of a "Safety Control Rod Ax Man": you let the business run under well-chosen managers, and observe. ***

As long as nobody's screwing up the above to the dissatisfaction of stakeholders, people are going to leave you alone.

So if the AI is programmed with appropriate constraints, or if it has a "morality core" and a sense of prediction and consequence, it is unlikely to allow a situation where the Board would be obliged to intervene.

Obviously, if the AI goes all GLaDOS, then the Board will be obliged to intervene or be in breach of their duty of care, and they can be held liable for that.

* Owners. Corporate owners are allowed, in as many shells as you like... however, there must be human owners at the bottom of this pile of turtles. But as long as all the human owners are A-OK with the business conduct, you're fine. An appropriate structure here might be the "not-for-profit" or "not-for-dividend" organization, where the organization is defined by a mission instead of a profit motive. In theory the owners are "The People" via the Attorney General; but in 200 Board meetings I've never seen either one show up LOL.

** The "mission" applies to non-profits or other structures which gain a tax benefit for focusing on a recognized charitable purpose or performing a role of government. In practice you answer to the IRS; they require an informational tax return where you document what you are doing with charity money. If you have dispensed with human owners by operating in a charitable space, then your AI needs to take extra care to conform with the requirements of that space.

*** The Board also picks human "officers", such as President, Secretary, and Treasurer as state law requires, who are officially responsible for management. But these are often ceremonial roles: if you have a well-established staff, the paper officer need only jump in if the staff has failed to do the job. Once I was President, and my sole job was to convene the next general membership meeting. Two people attended, and one was me. State law defining these roles requires that the assignee oversee that the job is done; it does not require that the assignee do the actual job personally.

Harper - Reinstate Monica
  • Under the law of most U.S. states there must be at least one human being named as a corporate officer. – ohwilleke May 18 '22 at 15:05
  • @ohwilleke sure, but it is common for those to be largely ceremonial roles, with the actual work delegated to staff. Been there done ... well, not much really. – Harper - Reinstate Monica May 18 '22 at 17:19
  • @Harper-ReinstateUkraine -- I take your larger point, but President, Treasurer, and Secretary of a corporation are commonly ceremonial? In my experience they are actual jobs with responsibilities. – George White May 18 '22 at 18:59
    @George I don't mean in every case, but I have certainly sat on boards where there was a paper treasurer and secretary, but all eyes turned to a manager for the actual reports in those areas. The person with the nameplate nominally managed the staff, but the staff didn't really need management. "The person who has the nameplate" and "the person who does the job" need not be the same. – Harper - Reinstate Monica May 18 '22 at 19:38
  • @Harper-ReinstateUkraine I don't disagree. But the point is that you need some human with legal authority in the loop even if the AI serves as basically a general manager doing most of the work. – ohwilleke May 18 '22 at 19:42
  • @ohwilleke Sure. So the AI is confined to activities which the humans won't find objectionable. Of course, humans have the same constraints. – Harper - Reinstate Monica May 18 '22 at 20:14
  • @Harper-ReinstateUkraine - I think you are talking about a secretary, for example, of the board. I’m talking about a corporate officer of the company. – George White May 19 '22 at 01:21
  • @GeorgeWhite I assume we're both talking about the corporate officers which are required by state law (which vary). Where I come from, the officers which must exist are president, secretary and treasurer. They don't define any other mandatory officers. So you don't need to create them lol. I can certainly relate to thinking of them as officers "of the Board", but they are in the eyes of the state officers "of the company". – Harper - Reinstate Monica May 19 '22 at 01:41
  • That's what I was talking about also, and it's hard for me to imagine them as commonly placeholders. – George White May 19 '22 at 06:16

There is no such thing as a "docile" director

A director who acts on the instructions of anyone else (be that another person, the author of their horoscope, or the programmer(s) of an AI) is failing in their duty as a director. A director must use their own judgement in the discharge of their duties.

Dale M
  • But what if their duty is to respect and protect the will of the AI? The enterprise would be owned by a trust whose sole purpose is to own the AI's data and the enterprise mobilizing the AI's resources. Do you understand what I am trying to imagine? – Alexis May 17 '22 at 22:52
    The AI is not a person. The company is. The director’s duty is always to the company – Dale M May 18 '22 at 01:10
    Oh. And trusts can’t own anything. The trustees own it. – Dale M May 18 '22 at 01:11
  • @Alexis "Respect and protect the will of the AI" is not a valid corporate or charitable mission statement. So the organization would have to have one, and the AI would have to operate within the domain of the organization's best interests. But that isn't any different for humans; we have to obey laws too. – Harper - Reinstate Monica May 18 '22 at 19:52