Author:
None
License:
The MIT License (MIT)
Copyright (c) Protect AI. All rights reserved.
Permission is hereby granted, free of charge, to any person obtaining a copy
of this soft...
Summary:
LLM-Guard is a comprehensive tool designed to fortify the security of Large Language Models (LLMs). By offering sanitization, detection of harmful language, prevention of data leakage, and resistance against prompt injection attacks, LLM-Guard ensures that your interactions with LLMs remain safe and secure.
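As a hedged sketch of what the summary describes, the snippet below scans a prompt with input scanners following the pattern shown in the project's README (`scan_prompt` plus scanner classes such as `PromptInjection`). The package must first be installed with `pip install llm-guard`; since it pulls in heavy dependencies (torch, transformers), the helper returns `None` when the package is absent.

```python
# Sketch based on llm-guard's documented scan_prompt usage; scanner names
# are taken from the project README and may differ between versions.

def check_prompt(prompt: str):
    """Scan a prompt with llm-guard input scanners.

    Returns (sanitized_prompt, all_valid, risk_scores),
    or None if llm-guard is not installed.
    """
    try:
        from llm_guard import scan_prompt
        from llm_guard.input_scanners import PromptInjection, TokenLimit
    except ImportError:
        # llm-guard (and its torch/transformers stack) is not installed
        return None

    scanners = [PromptInjection(), TokenLimit()]
    sanitized, results_valid, results_score = scan_prompt(scanners, prompt)
    # results_valid maps scanner name -> bool; collapse to a single verdict
    return sanitized, all(results_valid.values()), results_score

if __name__ == "__main__":
    print(check_prompt("Summarize this document for me."))
```

On success the tuple carries the sanitized prompt, an overall pass/fail verdict, and a per-scanner risk-score mapping, which makes it easy to gate an LLM call on `all_valid`.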
Latest version:
0.3.16
Required dependencies:
bc-detect-secrets | faker | fuzzysearch | json-repair | nltk | presidio-analyzer | presidio-anonymizer | regex | structlog | tiktoken | torch | transformers
Optional dependencies:
autoflake | llm_guard | mkdocs | mkdocs-autorefs | mkdocs-git-revision-date-localized-plugin | mkdocs-jupyter | mkdocs-material | mkdocs-material-extensions | mkdocs-swagger-ui-tag | optimum | pre-commit | pyright | pytest | pytest-cov | ruff
Downloads last day:
5,905
Downloads last week:
64,406
Downloads last month:
293,411