Proxy Overview

How GPTBoost functions

GPTBoost serves as a proxy for OpenAI Large Language Models (LLMs): it sits between your LLM-powered application and the OpenAI API and forwards every request to the LLM on your behalf. Sitting in the request path allows GPTBoost to provide a number of benefits, including:

  • Simplified integration: You only need to replace your call to the LLM with a call to the proxy. This makes it easy to drop GPTBoost Proxy into existing applications and workflows (see the sketch after this list).

  • Logging: GPTBoost logs every request that goes through the proxy, including the request body, response body, and all metadata. You can also attach any custom property that matters for your specific app and use case, such as a user ID or IP address.

  • Security: GPTBoost can add an extra layer of protection through GPTBoost IP security policies, which help protect your LLM usage from unauthorized access and malicious use.
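For an existing OpenAI integration, switching over to the proxy typically comes down to pointing the client at a different base URL. Below is a minimal sketch using the official OpenAI Python SDK; the proxy URL shown is a placeholder, so substitute the endpoint from your GPTBoost dashboard.

```python
from openai import OpenAI

# Point the client at the GPTBoost proxy instead of api.openai.com.
# The base_url below is a placeholder; use the proxy endpoint shown
# in your GPTBoost dashboard.
client = OpenAI(
    api_key="sk-...",  # your regular OpenAI API key
    base_url="https://<your-gptboost-endpoint>/v1",
)

# The request itself is unchanged; GPTBoost forwards it to OpenAI and
# logs the request body, response body, and metadata along the way.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello from behind the proxy!"}],
)
print(response.choices[0].message.content)
```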

In short, GPTBoost is a powerful tool for enriching your integration with LLMs, improving security, and providing detailed logging and auditing capabilities.

GPTBoost serves as a proxy between your LLM-powered application and the LLM model