Breaking Boundaries: A Comparative Analysis of Two AI Contract Review Approaches

What Limitations Are There?


AI-powered contract analysis tools that leverage the capabilities of large language models can provide advanced and efficient contract analysis. However, it is essential to recognise and address the limitations of these models to ensure accurate and reliable results.

Hallucination

One significant limitation of large language models ('LLMs'), such as GPT-3.5, used in contract analysis is the potential for hallucination (i.e. generating fictitious information). These models, while capable of generating contextually coherent responses, can sometimes produce content that does not align with the factual reality of the contract. This is because LLMs generate text based on patterns and associations learned from vast amounts of training data, without an inherent understanding of real-world facts or access to specific contract databases. As a result, there is a risk of the model generating inaccurate or fictional details that misrepresent the provisions or intentions of the actual contract.


Limitations in Analysing Tables and Schedules

While LLMs are proficient at understanding and generating text, they may struggle to accurately extract and interpret information presented in a tabular or structured format within a document. Their limitations in analysing tables and schedules in agreements stem from their reliance on sequential patterns in text data rather than a comprehensive understanding of structured data. Unlike humans, who can easily interpret tabular information, LLMs currently lack the ability to recognise the distinct organisation and relationships within tables or schedules. Consequently, LLMs may not effectively discern the significance of specific cells, columns, or rows, leading to potential errors or incomplete analysis.
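To illustrate why sequential processing loses tabular structure, here is a minimal Python sketch. The payment schedule and the `flatten_for_llm` function are purely illustrative assumptions, not part of any real extraction pipeline; the point is that once a grid is serialised into the single running string a language model consumes, nothing marks where one column ends and the next begins.

```python
# A schedule as a human reads it: rows and columns visibly align.
schedule = [
    ["Milestone", "Due Date", "Amount"],
    ["Signing", "01 Jan 2024", "$10,000"],
    ["Delivery", "01 Mar 2024", "$40,000"],
]

def flatten_for_llm(rows):
    """Serialise a table the way naive plain-text extraction often does:
    every cell joined into one running string, discarding the grid."""
    return " ".join(cell for row in rows for cell in row)

linear = flatten_for_llm(schedule)
print(linear)
# In this linear stream, "Delivery" and "$10,000" are simply nearby
# tokens; the fact that they belong to different rows has been lost,
# which is exactly the association a model can then get wrong.
```

A human glancing at the grid pairs "Delivery" with "$40,000" instantly; a model seeing only the flattened string has to infer that pairing from word order alone.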


Legal Language and Contextual Understanding

Apart from analysing tables and schedules, LLMs may also face challenges in certain other areas of contract analysis. They might have difficulty understanding highly specialised or technical legal language, as well as complex legal concepts. LLMs' responses are based on patterns they have learned from training data, and they may not always comprehend the specific legal context or intricacies of certain legal domains.


'Exclusive AI Approach' vs 'Human Verification'


Many Contract Lifecycle Management (CLM) software solutions rely on a combination of AI capabilities and human review for contract analysis (the 'Hybrid Approach'), aiming to compensate for the accuracy limitations of the underlying models.

However, at KeyTerms, we understand the drawbacks associated with human intervention in CLM software. We acknowledge the limitations of existing language models and believe that an alternative approach is necessary.

Therefore, we have taken a distinct path. Our models undergo continual refinement to ensure the best possible results, leveraging the capabilities of existing language models. By fine-tuning our models, we strive to provide you with superior outcomes without involving any human verification (the 'Exclusive AI Approach').

With this approach, we aim to offer an advanced and efficient CLM solution that maximises accuracy and minimises potential downsides associated with human involvement. Our commitment to continuously enhancing our models empowers you with the most effective contract analysis tools available in the market.


Drawbacks associated with the 'Hybrid Approach'


The combination of AI and human verification is often seen as a way to strike a balance between efficiency and accuracy in contract analysis. By leveraging AI technology, processes can be streamlined and expedited, while human verification provides an extra layer of scrutiny to ensure precision. However, it is important to recognise that this approach does not present the full picture.

Lack of Legal Expertise

One significant drawback of relying on human verification is the potential lack of legal expertise. Individuals involved in the verification process may not hold a licence to practise law, which raises concerns about their ability to provide sound advice on complex contractual matters. Even where platforms utilise attorneys for verification purposes, those attorneys may not offer the same degree of legal reliance as practising lawyers who work within specific jurisdictions and carry professional indemnity insurance.

Potential Conflicts of Interest

Another drawback is the potential for conflicts of interest. When reviewers work for a platform that handles contracts from many different customers, conflicts of interest can arise. Unlike law firms, which conduct stringent conflict-of-interest searches, these platforms may not have comprehensive measures in place to address this concern. As a result, clients who require legal reliance on the analysis of a contract will ultimately need to engage the services of a lawyer.

Delays and Increased Costs: Reconsidering Efficiency

The introduction of human verification into the process can lead to delays and increased costs. While the involvement of human reviewers aims to enhance accuracy, it inevitably adds time and expense. This contradicts the initial purpose of utilising an automated contract analysis tool, which is to streamline and expedite the process. The Exclusive AI Approach, which relies solely on advanced AI technology, can provide faster and more cost-effective results, aligning with the objective of efficiency in contract analysis.

Conclusion


It is crucial for businesses to consider these drawbacks and evaluate their specific needs when choosing a contract analysis solution. While the Hybrid Approach may seem appealing, it is important to understand its limitations and assess whether an Exclusive AI Approach may better align with the desired objectives of efficiency, accuracy, confidentiality and cost-effectiveness.


At KeyTerms, we are committed to tackling the limitation of hallucination by continuously honing our models. Through fine-tuning techniques tailored to legal contracts and active collaboration with domain experts, we pursue ongoing improvements to the precision of our contract analysis tool. While substantial progress has been made in mitigating hallucination and the other limitations mentioned above, we recognise that hallucination remains an ongoing challenge in the realm of AI-powered contract analysis.

