Blockchain offers opportunity to contain ‘chaotic robots’

A large language model can already function as an assistant, doing all your routine office tasks, which is fine if you don't mind daffy outcomes.

In their paper, Institutions to constrain chaotic robots: why generative AI needs blockchain, Chris Berg, Sinclair Davidson and Jason Potts (all RMIT) warn that the problem is LLMs' "creativity, hallucinations or lack of alignment with our intentions," which makes letting them loose, even on modest tasks like email, a big risk.

But they have a solution that does not require thinking about AI safety on a “civilisational timescale” – users creating rules that govern LLMs connected in a system. 

As things stand, “LLMs are chaotic robots: they are too unpredictable, their outputs appear too ‘random’, to be trusted to control systems that need security and predictability,” the RMIT team warn.

So they propose a way for LLMs to safely do useful tasks, without them going off on frolics of their own, using blockchains for “contract-like agreements enforced by code that specifies what their human users want them to do and not do, with tamper proof content, via as many blockchains as groups of users want.”

“A smart contract wrapped LLM is a secure wrapper on an instruction to a compute system.”
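To make that idea concrete, here is a minimal sketch, in Python, of what such a wrapper could look like: user-defined rules gate every action an LLM agent proposes, and each decision is chained into a tamper-evident log standing in for the blockchain layer. The paper does not specify an implementation; all the names and rules below are invented for illustration.

```python
# Hypothetical sketch of a "smart-contract-wrapped LLM" instruction.
# Rules set by the human user gate the agent's proposed actions, and
# every decision is appended to a hash-chained, tamper-evident log.

import hashlib
import json

# Rules the user sets: what the agent may and may not do (illustrative).
ALLOWED_ACTIONS = {"draft_email", "schedule_meeting"}
FORBIDDEN_RECIPIENTS = {"all-staff@example.com"}

def permitted(action: dict) -> bool:
    """Contract-like check: approve only actions the rules allow."""
    if action["name"] not in ALLOWED_ACTIONS:
        return False
    if action.get("recipient") in FORBIDDEN_RECIPIENTS:
        return False
    return True

def append_to_log(log: list, entry: dict) -> None:
    """Hash-chain each decision so the record is tamper-evident."""
    prev_hash = log[-1]["hash"] if log else "genesis"
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    entry["hash"] = hashlib.sha256(payload.encode()).hexdigest()
    log.append(entry)

# An action the LLM proposes (in practice, parsed from model output).
proposed = {"name": "draft_email", "recipient": "boss@example.com"}

audit_log: list = []
decision = "execute" if permitted(proposed) else "block"
append_to_log(audit_log, {"action": proposed, "decision": decision})

print(decision, audit_log[-1]["hash"][:12])
```

In the authors' scheme, the enforcement and the log would live on a blockchain rather than in a local script, so no single party could quietly change the rules or the record.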

This, they argue, is a better way to establish security than relying on a regulator, "because there is no hierarchical power without discretion … precisely the problem that blockchains originally sought to solve through distributed consensus."

And it makes LLMs the "killer app" for smart contracts.

"With smart contracts, LLMs can interact safely in the real world and can unlock the vast economic opportunity of economically aligned and artificially intelligent agents," they suggest: the AI creates value by automating tasks, while the blockchain it is part of provides trust.
