opaqueprompts-chat-server

August 31, 2023 · View on GitHub

Description

This is a chat server that uses the OpaquePrompts API to build an LLM application protected by confidential computing. OpaquePrompts is a privacy layer around LLMs that hides sensitive data from the model provider. You can find a deployed version of this chat server on the OpaquePrompts website.
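To illustrate the idea of a privacy layer, here is a minimal, self-contained sketch of the pattern OpaquePrompts implements: sensitive data is replaced with placeholders before the prompt reaches the model provider, and the placeholders are restored in the response. The class and regex below are illustrative only, not the actual OpaquePrompts API.

```python
import re

class SanitizingLLM:
    """Toy privacy layer: redacts emails before calling the base LLM,
    then restores them in the response. Illustrative, not the real API."""

    EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.\w+(?:\.\w+)*")

    def __init__(self, base_llm):
        self.base_llm = base_llm  # any callable taking a prompt string

    def __call__(self, prompt: str) -> str:
        mapping = {}

        def redact(match):
            placeholder = f"EMAIL_{len(mapping)}"
            mapping[placeholder] = match.group(0)
            return placeholder

        sanitized = self.EMAIL.sub(redact, prompt)
        response = self.base_llm(sanitized)  # provider sees only placeholders
        for placeholder, original in mapping.items():
            response = response.replace(placeholder, original)
        return response

# A stand-in "LLM" that just echoes its (sanitized) input.
echo_llm = SanitizingLLM(lambda p: f"Echo: {p}")
print(echo_llm("Contact alice@example.com about the invoice."))
```

The base LLM only ever sees `EMAIL_0`, while the caller gets the original address back in the response.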

OpaquePrompts LangChain Integration

This application is built with the OpaquePrompts LangChain integration.

To use OpaquePrompts, first retrieve an API token from the OpaquePrompts website. Then all you need to do is wrap the LLM passed into LLMChain with OpaquePrompts:

from langchain.chains import LLMChain
from langchain.llms import OpenAI, OpaquePrompts

chain = LLMChain(
    prompt=prompt,
    # llm=OpenAI(),  # unprotected: prompts would reach the provider as-is
    llm=OpaquePrompts(base_llm=OpenAI()),  # sanitizes prompts before they reach OpenAI
    memory=memory,
)

Note that the source code also includes logic for authentication and for displaying intermediate steps (i.e., the sanitized prompt and the sanitized response).