trulens.core.feedback.provider

Classes

Provider

Bases: WithClassInfo, SerialModel

Base Provider class.

TruLens uses Feedback Providers to generate evaluations of large language model applications. These providers act as access points to different models, most commonly classification models and large language models.

These models are then used to generate feedback on application outputs or intermediate results.

Provider is the base class for all feedback providers. It is an abstract class and should not be instantiated directly. Rather, it should be subclassed and the subclass should implement the methods defined in this class.

There are many feedback providers available in TruLens that grant access to a wide range of proprietary and open-source models.

Providers for classification and other non-LLM models should directly subclass Provider. The feedback functions available for these providers are tied to specific providers, as they rely on provider-specific endpoints to models that are tuned to a particular task.

For example, the Huggingface feedback provider gives access to a number of classification models for specific tasks, such as language detection. These models are then used by a feedback function to generate an evaluation score.

Example:

```python
from trulens.providers.huggingface import Huggingface
huggingface_provider = Huggingface()
huggingface_provider.language_match(prompt, response)
```
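
A custom non-LLM provider can be built by subclassing Provider directly. The sketch below is illustrative only: the class name, the positive_sentiment method, and the keyword heuristic are hypothetical stand-ins for a real model call; it shows only the shape such a subclass takes and the convention of returning a score between 0 and 1.

```python
from trulens.core.feedback.provider import Provider


class KeywordSentimentProvider(Provider):
    """Hypothetical provider backed by a trivial keyword heuristic.

    A real provider would call out to a hosted or local classification
    model, typically via its endpoint.
    """

    def positive_sentiment(self, text: str) -> float:
        # Map the (here: keyword-based) model output to a score in [0, 1],
        # the convention used by TruLens feedback functions.
        positive = {"good", "great", "excellent"}
        return 1.0 if positive & set(text.lower().split()) else 0.0


provider = KeywordSentimentProvider()
provider.positive_sentiment("The answer was great and on topic.")  # -> 1.0
```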

Providers for LLMs should subclass trulens.feedback.LLMProvider, which itself subclasses Provider. Providers for LLM-generated feedback are more plug-and-play: the base model of your choice can be combined with feedback-specific prompting to generate feedback.

For example, relevance can be run with any base LLM feedback provider. Once the feedback provider is instantiated with a base model, the relevance function can be called with a prompt and response.

This means that the base model selected is combined with specific prompting for relevance to generate feedback.

Example:

```python
from trulens.providers.openai import OpenAI
provider = OpenAI(model_engine="gpt-3.5-turbo")
provider.relevance(prompt, response)
```

Attributes
tru_class_info instance-attribute
tru_class_info: Class

Class information of this pydantic object for use in deserialization.

This odd key is used to avoid polluting attribute names in whatever class this is mixed into. It should be the same as CLASS_INFO.

endpoint class-attribute instance-attribute
endpoint: Optional[Endpoint] = None

Endpoint supporting this provider.

Remote API invocations are handled by the endpoint.
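
For illustration (reusing the OpenAI provider from the example above; whether the endpoint is populated depends on the concrete provider), the attribute can be inspected after instantiation:

```python
from trulens.providers.openai import OpenAI

provider = OpenAI(model_engine="gpt-3.5-turbo")

# When set, the endpoint mediates the provider's remote API calls.
print(provider.endpoint)
```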

Functions
__rich_repr__
__rich_repr__() -> Result

Requirement for pretty printing using the rich package.

load staticmethod
load(obj, *args, **kwargs)

Deserialize/load this object using the class information in tru_class_info to look up the actual class that will do the deserialization.
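
A hedged sketch of a round trip through load, assuming the provider was serialized with pydantic's model_dump (the variable names are illustrative):

```python
from trulens.core.feedback.provider import Provider
from trulens.providers.huggingface import Huggingface

huggingface_provider = Huggingface()

# Serialize to a plain dict; tru_class_info records the concrete class.
serialized = huggingface_provider.model_dump()

# load consults tru_class_info to reconstruct the original subclass,
# even though it is called on the base Provider class.
restored = Provider.load(serialized)
```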

model_validate classmethod
model_validate(*args, **kwargs) -> Any

Deserialize a jsonized version of the app into an instance of the class it was serialized from.

Note

This process uses extra information stored in the jsonized object and handled by WithClassInfo.
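
A sketch under the same assumptions as the load example above: model_validate called on the base class should return an instance of the subclass the object was serialized from.

```python
from trulens.core.feedback.provider import Provider
from trulens.providers.huggingface import Huggingface

serialized = Huggingface().model_dump()

# The stored class information (handled by WithClassInfo) determines the
# concrete class of the deserialized object.
restored = Provider.model_validate(serialized)
assert isinstance(restored, Huggingface)
```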