

Intro

We’ve all seen how powerful LLMs can be, and up to 77% of businesses in the US market are using GenAI in at least one business process [1]. However, the applications currently available for accessing these LLMs are designed for consumer and individual use. Enterprises face a number of challenges in adopting LLMs for business processes while maintaining the security, auditability, access controls, and reporting capabilities that enterprise operations require.

For example, let’s say a claims reviewer at an insurance company uses an LLM chat tool to find a rule within a policy document and relies on the response to deny a claim. The member decides to appeal the denial. The supervisor goes back to the claims reviewer and asks about this particular denial, but the organization doesn’t have a way to reliably trace what questions the reviewer asked and how they used that information to make the denial.

Hyperscience has addressed these challenges with two exciting features: Document Chat and our LLM Install Block. We have made it possible for enterprises to incorporate the LLM of their choice into their business processes while providing the security, access controls, auditability, and reporting capabilities that they need.

Document Chat

With the Document Chat capability, which has been added to our Custom Supervision framework, Flow Developers have the flexibility to incorporate the LLM of their organization’s choice right inside their Hyperscience workflow, empowering Knowledge Workers with all the text-based capabilities of the LLM.

What makes Document Chat more suitable for enterprises than consumer chat apps?

Security

Protect your customers’ PII by limiting the amount of data sent to an LLM. Using our customizable Flow orchestration, you can specify exactly which pieces of data get sent to the LLM. With other solutions, you might have to send the entire document – exposing significant customer data and in turn increasing your risk exposure.
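To make the idea concrete, here is a minimal sketch of field-level data minimization. This is purely illustrative: the function and field names are invented for the example and are not Hyperscience’s actual Flow API, which this post doesn’t detail.

```python
# Hypothetical illustration: send only the fields a question needs,
# never the whole document. All names here are invented for the sketch.

def build_llm_payload(document: dict, allowed_fields: list[str], question: str) -> dict:
    """Keep only whitelisted fields before any data leaves the Flow."""
    context = {k: v for k, v in document.items() if k in allowed_fields}
    return {"question": question, "context": context}

claim = {
    "policy_id": "POL-1234",
    "coverage_clause": "Water damage is covered only if ...",
    "member_name": "Jane Doe",    # PII - excluded from the payload
    "member_ssn": "***-**-1234",  # PII - excluded from the payload
}

payload = build_llm_payload(
    claim,
    allowed_fields=["policy_id", "coverage_clause"],
    question="Does this policy cover water damage?",
)
```

The payload now carries only the policy fields the question needs, keeping the member’s PII out of the LLM request entirely.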

Auditability

Each time a question is asked using Document Chat, an audit log is recorded. This gives you full traceability and enables you to stay compliant with internal security operations.
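As a rough sketch of what such traceability captures, an audit record might look like the following. The schema here is hypothetical; Hyperscience’s actual log format is not detailed in this post.

```python
# Hypothetical audit record for a Document Chat question
# (illustrative only; not the product's actual log schema).
from datetime import datetime, timezone

def audit_record(user: str, document_id: str, question: str) -> dict:
    """Capture who asked what, about which document, and when."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "document_id": document_id,
        "question": question,
    }

entry = audit_record("claims_reviewer_42", "DOC-981", "Is water damage covered?")
```

A record like this is what lets a supervisor later reconstruct which questions a reviewer asked before making a decision.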

Access Controls

Restrict access to Document Chat with Hyperscience’s standard RBAC features – only granting access to user groups who should be able to use the feature, and ensuring the use of approved corporate LLM accounts (rather than private / individual employee accounts).

Reporting

See how many questions your Knowledge Workers are asking the LLM and how many cases were closed with the help of LLMs by using Hyperscience’s built-in reporting capabilities.

Composability

Incorporate an LLM into any Hyperscience Flow. With other Blocks such as our GCP integration Blocks, you can build a RAG application using Hyperscience to increase the accuracy of LLM responses and reduce the potential for hallucinations.
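The RAG pattern mentioned above can be sketched in a few lines: retrieve the most relevant passages, then ground the prompt in them. This sketch uses a deliberately crude keyword-overlap retriever and invented example data; a real deployment would use the GCP integration Blocks and an embedding-based retriever, neither of which is shown here.

```python
# Minimal RAG sketch (illustrative only): retrieve relevant passages,
# then build a prompt grounded in them to reduce hallucinations.

def score(query: str, passage: str) -> int:
    """Crude relevance measure: count shared lowercase words."""
    return len(set(query.lower().split()) & set(passage.lower().split()))

def retrieve(query: str, passages: list[str], k: int = 2) -> list[str]:
    """Return the k passages most relevant to the query."""
    return sorted(passages, key=lambda p: score(query, p), reverse=True)[:k]

def build_prompt(query: str, passages: list[str]) -> str:
    """Ground the LLM prompt in the retrieved context."""
    context = "\n".join(f"- {p}" for p in retrieve(query, passages))
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context}\n"
        f"Question: {query}"
    )

policy_passages = [
    "Flood damage requires a separate rider.",
    "Premiums are due on the first of each month.",
    "Water damage from burst pipes is covered under section 4.",
]
prompt = build_prompt("Is water damage covered?", policy_passages)
```

Because only the retrieved passages enter the prompt, the LLM answers from your documents rather than from whatever it memorized during training.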

Offline LLM Capability

What if your organization does not support the use of externally hosted LLMs? This is certainly a concern for enterprises in highly sensitive contexts such as insurance, government, and financial services. For customers with these constraints, Hyperscience offers an option to install an offline LLM into your Hypercell instance. We support a limited set of well-tested LLMs that can be installed directly into your Hyperscience installation, giving you all of the benefits above without any of your data leaving your infrastructure.

Use Cases

When incorporating an LLM into your Hyperscience Flow, you can use any text-based capabilities of the LLM. The possibilities are endless.

One of the greatest benefits of Document Chat is the flexibility it gives your employees to handle unforeseen extraction or interrogation needs. With traditional extraction, you train a model up front to find specific pieces of information in the documents at hand. As you progress your AI program into more advanced use cases, you may find it impossible to know in advance 100% of the information you’ll need to review from a document or packet of documents. Document Chat enables your Knowledge Workers to complete last-mile extraction (finding data they didn’t know they needed at the outset), generate summaries, surface key points, and tap into the wealth of knowledge that LLMs possess.

Contact your Hyperscience account manager to see mortgage processing and medical case review demos, and for us to help you explore the opportunities for Document Chat in your digital transformation journey!


[1] https://sqmagazine.co.uk/generative-ai-statistics