Meet Spellbook, the GPT-3 Generative AI Word Add-In For Contracts

In another example of the use of generative AI approaches in the legal sector, Toronto-based Rally has launched a GPT-3 based add-in for Word called Spellbook, which is designed to help lawyers with legal drafting.

Spellbook uses OpenAI’s GPT-3 large language model, an AI trained on 45 terabytes of data from books and the internet, further ‘tuned’ on legal datasets for ‘optimal contracting performance’, they explained.

It can handle the following tasks:

  1. ‘Language Suggestion: Spellbook can draft new clauses and sections, taking the full context of the contract into account.
  2. Negotiation Suggestions: Spellbook can list common points for negotiation based on the contract.
  3. Term Summaries: Spellbook can instantly create short term summaries for any common contract.’

Artificial Lawyer was understandably curious to know some more, especially after recently highlighting the work by PatentPal, which uses a non-GPT-3 generative AI model.

This site asked Scott Stevenson, CEO of Rally – which provides its core legal management platform to 110 law firms – about how they are leveraging this technology.

When did this start and what is Rally? 

Rally is five years old, we’re a legal automation platform targeted at businesses and their law firms. Spellbook is our new GPT-3 based drafting and review tool that we officially launched on September 1st this year.

How do you get the right text examples into the GPT-3 system so that they can be drawn upon? 

We have a proprietary set of documents, built using open-source documents, our own legal team, and our own templating technology, which allows us to create many variations. We use these for fine-tuning, few-shot examples, and prompt engineering, which helps Spellbook perform significantly better than raw GPT-3 at legal tasks.

Since GPT-3 is trained on much of the internet, including public databases like EDGAR (as well as non-fiction books and Wikipedia), it has had exposure to many contracts and has a sort of ‘general knowledge’ about them. We are able to constrain outputs to the contracting domain.
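The few-shot technique Stevenson mentions can be sketched in a few lines. This is a minimal, illustrative example of how a few-shot prompt for clause drafting might be assembled; the example clauses and prompt format are invented for illustration and are not Spellbook’s actual prompts:

```python
# Sketch of few-shot prompt construction for clause drafting.
# The example clauses below are hypothetical, for illustration only.

FEW_SHOT_EXAMPLES = [
    ("Draft a confidentiality clause.",
     "Each party shall keep the other party's Confidential Information "
     "strictly confidential and use it only for the purposes of this Agreement."),
    ("Draft a governing law clause.",
     "This Agreement shall be governed by and construed in accordance with "
     "the laws of the Province of Ontario."),
]

def build_prompt(instruction: str) -> str:
    """Assemble a few-shot prompt: worked examples first, then the new task."""
    parts = []
    for task, clause in FEW_SHOT_EXAMPLES:
        parts.append(f"Task: {task}\nClause: {clause}\n")
    parts.append(f"Task: {instruction}\nClause:")
    return "\n".join(parts)

prompt = build_prompt("Draft a limitation of liability clause.")
# The assembled prompt would then be sent to a completion endpoint
# (e.g. GPT-3), which continues the pattern by drafting the requested clause.
```

Constraining the prompt to worked legal examples like this is one way to keep a general-purpose model’s outputs within the contracting domain, alongside fine-tuning on legal datasets.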

But, isn’t GPT-3 just a very random collection of texts? How can a lawyer rely upon its output? 

GPT-3 has consumed a wide variety of texts, just like a well-read lawyer! This actually helps give it general knowledge (e.g. when reviewing an employment contract, it has knowledge of salaries in different jurisdictions from websites like Glassdoor.com).

We’ve been very effective at constraining outputs to the legal domain. GPT-3 has digested tonnes of legalese. That said, we don’t think lawyers should blindly trust Spellbook. It is more like a muse that gives you marble to carve where you didn’t have any. It’s also a ‘second set of eyes’ for detecting missing provisions, missing definitions, unusual language, etc.

Do you use playbooks for the new text? 

Currently, no, but this is a highly requested feature that is on our roadmap.

And, what happens to the documents? How can you control privacy if the document is going to a 3rd party, e.g. OpenAI? 

Privacy is deeply important to us, in our core product and especially in Spellbook:

  • Data is encrypted for its whole journey
  • We do not store any document data at Spellbook
  • Data is sent to OpenAI’s GPT-3 servers for processing; some form of 3rd-party data processing is unavoidable when using this kind of technology, as these models are too massive to run on your own computer.
  • Data is not accessible to any 3rd parties outside of OpenAI
  • We opted out of OpenAI’s programme to use data to train their model further, so it will not be used for that purpose
  • OpenAI may retain encrypted data for up to 30 days:
    • This data is encrypted and is only accessible to authorised engineers for (1) debugging purposes in the event of a failure, (2) investigating patterns of abuse and misuse or (3) improving the content filtering system through using the prompts and completions flagged for abuse or misuse.

Overall, OpenAI is a secure, reputable company with many high-profile customers like Microsoft. Cloud computing is the norm now: if a lawyer is using tools like Office365 or the latest version of Word, they are also sending documents to a 3rd party, and storing the data there much longer.

Thanks Scott, very interesting work!

So, there you go. This site has not seen a detailed demo yet, but this looks promising, although the privacy aspect of working with OpenAI is still something of a conundrum. That said, Rally believes they have sufficient safeguards in place. (There is, however, a short promotional video provided by Rally that you can check out below.)

A Rally Production, 2022.

More broadly, the approach of using generative AI to tap a corpus of texts that relate to what you are writing makes total sense.

Of course, there is much more finessing to come – there always is. No doubt more companies will come to market with different approaches and the techniques will be refined, just as NLP doc review has evolved over the last decade in the legal field. The first NLP ‘doc AI’ companies in the early 2010s had a lot of development ahead of them before they reached the levels of proficiency they have now – but they got there.

To conclude, kudos to Rally for experimenting and exploring, that’s what legal innovation is all about.