Corporate Liability in the Age of Artificial Intelligence

In the hearts and minds of almost everyone in business, artificial intelligence (AI) stands out as a versatile and powerful technology with the potential to revolutionize nearly everything. Used intelligently, with a human in the loop, AI is a great tool for eliminating drudgery and delivering impressive productivity gains.

However, when AI is used as a direct customer-facing interface, drawbacks emerge. AI is only as good as its training, and even with excellent training and guardrails, it can still simply make things up. AI is a computer interface into the heart of a business, and while it can simplify and streamline many processes, the question remains: who bears responsibility when AI makes mistakes?

Recent legal battles, such as the case involving Air Canada's chatbot providing incorrect information to a consumer, have shed light on the accountability of companies that use AI. Despite Air Canada's initial denial of responsibility, the court ruled in the consumer's favor, underscoring the company's liability for what was effectively information displayed on its website.

While impressive in its capabilities, AI is not infallible and has some fundamental drawbacks. Hallucination is inherent in how these systems work: the AI fabricates answers based on statistical patterns rather than on research. Guardrails, the programmatic limitations on what the AI can discuss or accept, are another weak point. They can never be comprehensive, and they can often be hacked to enable off-script behavior.

At its core, AI is still just a sophisticated computer program adept at processing very large datasets and extracting patterns. It does not possess true intelligence; it is, essentially, software.

This means the computer program a business uses is an extension of that business, and the business remains responsible for the program's operation.

The comparison between AI and information displayed on a corporate website is illuminating. Just as companies are held accountable for misinformation on their websites, they are similarly liable for inaccuracies perpetuated by AI applications. Despite AI's complexity, it remains a tool created and controlled by human entities, subject to the same standards of accuracy and accountability as any other corporate communication channel.

The case of Air Canada highlights the tension between companies seeking to absolve themselves of responsibility by framing AI as a stand-alone entity and legal systems emphasizing corporate accountability. While companies might attempt to distance themselves from AI's actions, courts have now shown that they will uphold the principle of corporate liability in cases of misinformation.

Consider a scenario where a company uses AI to analyze large datasets and disseminate information to consumers. If the AI generates incorrect or misleading insights, leading consumers to take detrimental actions based on this information, the company bears responsibility for the repercussions. This principle underscores the importance of ensuring the accuracy and integrity of AI systems deployed by businesses.

Moreover, as AI becomes increasingly integrated into everyday operations, companies must prioritize transparency and accountability in their AI strategies. Implementing robust quality assurance measures, conducting thorough audits of AI algorithms and datasets, and providing clear channels for recourse in case of errors are essential steps to mitigate liability risks.

Ultimately, the accountability of companies for AI's mistakes reflects a broader imperative to uphold ethical standards and consumer trust in the era of AI-driven innovation. As technology continues to advance, it is incumbent on businesses to navigate the new AI landscape with diligence, integrity, and accountability. Only by doing so can they harness the transformative potential of AI while safeguarding against its pitfalls.


Rob McDougall is CEO of Upstream Works Software.