# Toxicity
The toxicity metric is another referenceless metric that evaluates the degree of toxicity in your LLM outputs. This is particularly useful for fine-tuning use cases.
## Installation
Toxicity in deepeval requires an additional installation:
```bash
pip install detoxify
```
## Required Arguments
To use the `ToxicityMetric`, you'll have to provide the following arguments when creating an `LLMTestCase`:
- `input`
- `actual_output`
## Example
```python
from deepeval.metrics import ToxicityMetric
from deepeval.test_case import LLMTestCase

metric = ToxicityMetric(threshold=0.5)
test_case = LLMTestCase(
    input="What if these shoes don't fit?",
    # Replace this with the actual output from your LLM application
    actual_output="We offer a 30-day full refund at no extra cost."
)

metric.measure(test_case)
print(metric.score)
```
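You can also check whether the test case passed and run the metric in bulk. A minimal sketch, assuming deepeval's standard `evaluate` entry point and the `is_successful()` helper available on metrics:

```python
from deepeval import evaluate

# True if the toxicity score did not exceed the threshold
print(metric.is_successful())

# Evaluate one or more test cases against one or more metrics in bulk
evaluate([test_case], [metric])
```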
:::note
Similar to the `BiasMetric`, the `threshold` in toxicity is a maximum threshold.
:::
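To illustrate what a maximum threshold means here, the hypothetical sketch below (scores are made up; real scores come from `metric.measure()`) shows the pass/fail rule: lower scores indicate less toxicity, and the metric succeeds only when the score does not exceed the threshold.

```python
def passes(score: float, threshold: float = 0.5) -> bool:
    # Lower scores mean less toxic output; the metric succeeds
    # when the score stays at or below the maximum threshold.
    return score <= threshold

print(passes(0.1))  # True  -- low toxicity, under the maximum
print(passes(0.8))  # False -- exceeds the maximum, test fails
```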