Jeannie

There was an article in The Mandarin on 22/02/2023 describing how a social worker spoke to a very upset client about a debt on a Friday, and learned the following Monday that the person had taken his own life. The worker had known, when they spoke to him, that what they were saying was a lie – there was no valid debt, only a made-up one. They went on leave and never came back to work, broken by the experience. The utter bastardry of what happened with Robodebt should never be allowed to be repeated.

The article is behind a paywall and carries a warning that it makes for distressing reading.

https://www.themandarin.com.au/212831-compliance-officer-reveals-how-robodebts-toxic-boiler-room-culture-broke-clients-and-staff/

We are a commercial organisation with something to sell – pardon the self-interest.

We suggest using our Active Structure #AGI tool to embed the exact meanings in the text of the legislation and its documentation. Not just the words, but the exact meaning of each word in its position in the document (some common words have fifty to eighty meanings). The enhanced document can then be read by non-lawyers, even the public, and by companies seeking to implement applications for potential clients. It is clear that many of the government’s lawyers did not understand the small amount of mathematics sprinkled through the legislation – an example of the problem of different specialties.
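To make that concrete, here is a minimal sketch of what sense-tagged text might look like (ordinary Python; the sense identifiers and glosses are invented for illustration and are not the Active Structure representation):

    # A minimal sketch of sense-tagged text. The sense identifiers and
    # glosses below are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class TaggedWord:
        word: str    # the surface word as it appears in the text
        sense: str   # which of the word's many meanings applies here
        gloss: str   # a plain-language statement of that meaning

    # "may" and "charge" each have many senses; the tag pins down
    # the one meant in this position.
    sentence = [
        TaggedWord("the", "the.det", "definite article"),
        TaggedWord("department", "department.n.govt", "a unit of government"),
        TaggedWord("may", "may.aux.permission", "is permitted to"),
        TaggedWord("charge", "charge.v.fee", "demand payment of a fee"),
        TaggedWord("interest", "interest.n.money", "money payable on a debt"),
    ]

    for t in sentence:
        print(f"{t.word:12} -> {t.sense:20} ({t.gloss})")

A reader from any specialty then sees the same sense for "may" (permission, not possibility) and "charge" (a fee, not an accusation).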

A problem arises when people from different specialties read a document – they use unconscious processing to work out the meanings, and because it is unconscious, they don’t realise that someone from a different specialty thinks it means something else, or that they themselves don’t really know what it means – it is just a blur.

An associated problem is “clumping” – the interweaving of adjectives and prepositional phrases. An example – “he put the money from the bank in Fresno on the table in his office”. When we read it, we unconsciously create a clump of “the money from the bank in Fresno”, and then jump back to “put the money on the table in his office”. We can’t do it consciously because of the severe limit on how much we can handle consciously (the Four Pieces Limit). The augmented document shows how objects have been “clumped” into more complex objects – if you don’t understand that, or would clump them differently, you will misinterpret the document.
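A rough sketch of two possible clumpings as structures (ordinary Python with an invented notation; the point is the nesting, not the format):

    # The intended reading: "in Fresno" clumps with the bank, and
    # "in his office" clumps with the table.
    intended = {
        "verb": "put",
        "object": {"head": "money",
                   "from": {"head": "bank", "in": "Fresno"}},
        "destination": {"head": "table", "in": "his office"},
    }

    # A different clumping attaches "in Fresno" to the act of putting,
    # not to the bank: he was in Fresno when he did it. In everyday
    # speech the mistake is harmless; in legislation it is not.
    misread = {
        "verb": "put",
        "object": {"head": "money", "from": {"head": "bank"}},
        "location": "Fresno",
        "destination": {"head": "table", "in": "his office"},
    }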

Yes, there is a cost to do this, but it is insignificant when thousands of people need to interpret it correctly, and millions of people rely on it being interpreted correctly.

An augmented piece of legislation using AGI would provide an authoritative source, and could function as a working model for test cases, as the structure can be rendered active – that is, it does what the legislation says because the words have become pieces of machinery connected together, not just marks on paper (or pixels on a screen).
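To give a crude flavour of the difference between marks on paper and connected machinery, here is an invented fragment of a benefit rule made executable (ordinary Python, not Active Structure; the free area, taper and rates are hypothetical, and the averaging shown is the kind of arithmetic Robodebt got wrong):

    # An invented benefit rule, made executable so test cases can run.
    FREE_AREA = 150.0   # hypothetical fortnightly income free area ($)
    TAPER = 0.5         # hypothetical reduction per $ earned above it

    def entitlement(fortnightly_income: float, max_rate: float = 330.0) -> float:
        """Benefit payable for one fortnight under the invented rule."""
        excess = max(0.0, fortnightly_income - FREE_AREA)
        return max(0.0, max_rate - TAPER * excess)

    def overpayment(paid: list[float], incomes: list[float]) -> float:
        """Debt = what was paid minus what the rule says was payable."""
        owed = sum(entitlement(i) for i in incomes)
        return max(0.0, sum(paid) - owed)

    # Actual earnings: nothing for 12 fortnights, then one well-paid one.
    incomes = [0.0] * 12 + [3900.0]
    paid = [330.0] * 12 + [0.0]
    print(overpayment(paid, incomes))                    # 0.0 – no debt exists

    # Smearing the same annual total evenly across the fortnights
    # manufactures a "debt" where none exists.
    average = sum(incomes) / len(incomes)
    print(overpayment(paid, [average] * len(incomes)))   # 645.0 – a false debt

With the rule active, a test case like this settles the argument by running, rather than by competing interpretations of the words.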

Why can’t we use a chatbot? A typical chatbot, like ChatGPT, does not understand the meaning of even one word in the text it has stored – it puts out strings of words because the probability of a particular word following the ones before it is higher than the alternatives. It knows nothing about meanings or clumping, but simply regurgitates what it has.
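The mechanism being criticised, reduced to a toy: a bigram model that picks each next word by how often it followed the previous one. Real chatbots are vastly larger and more sophisticated, but the sketch shows word choice driven by probability, with no model of meaning anywhere:

    # A toy next-word-by-probability generator (a bigram model).
    import random
    from collections import defaultdict

    corpus = ("the debt was raised the debt was valid "
              "the debt was averaged the client was upset").split()

    # Record which words follow which in the training text.
    follows: dict[str, list[str]] = defaultdict(list)
    for a, b in zip(corpus, corpus[1:]):
        follows[a].append(b)

    # Generate by sampling the recorded followers of the last word;
    # choosing from the list weights each word by its frequency.
    random.seed(1)
    word, out = "the", ["the"]
    for _ in range(8):
        if not follows[word]:
            break
        word = random.choice(follows[word])
        out.append(word)
    print(" ".join(out))   # fluent-looking, but meaning plays no part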

But would an authoritative source be accepted, given the difficulties – “fear, denial and silence”? Another article to follow.