Integrations¶
Third-party service integrations for CodeGraph, including LLM providers, security services, and external tools.
Overview¶
```
┌─────────────────────────────────────────────────────────────┐
│                        Integrations                         │
├─────────────────────────────────────────────────────────────┤
│  LLM Providers              │  Security Services            │
│  ├─ Yandex AI Studio        │  ├─ Vault (secrets)           │
│  ├─ GigaChat (Sber)         │  ├─ SIEM (logging)            │
│  ├─ OpenAI                  │  └─ DLP (data protection)     │
│  └─ Local (Ollama)          │                               │
├─────────────────────────────────────────────────────────────┤
│  External Tools             │  Databases                    │
│  ├─ GoCPG (CPG)             │  ├─ DuckDB (graph)            │
│  ├─ Git                     │  └─ ChromaDB (vectors)        │
│  └─ Docker                  │                               │
└─────────────────────────────────────────────────────────────┘
```
Directory Structure¶
```
integrations/
├── en/                       # English documentation
│   ├── YANDEX_AI_STUDIO.md   # Yandex Cloud AI integration
│   └── GIGACHAT.md           # Sber GigaChat integration
└── ru/                       # Russian translations
    └── README.md
```
Available Integrations¶
LLM Providers¶
| Integration | Description | Status |
|---|---|---|
| Yandex AI Studio | YandexGPT, Qwen3 via OpenAI-compatible API | Production |
| GigaChat | Sber GigaChat LLM integration | Production |
| OpenAI | OpenAI GPT models | Production |
| Local (Ollama) | Self-hosted models via Ollama | Beta |
Security Services¶
| Integration | Description | Status |
|---|---|---|
| HashiCorp Vault | Secrets management | Production |
| SIEM | Security event logging | Production |
| DLP | Data loss prevention | Production |
External Tools¶

| Integration | Description | Status |
|---|---|---|
| GoCPG | CPG generation from source code | Required |
| Git | Version control integration | Required |
| Docker | Container deployment | Optional |
Configuration¶
LLM Provider Configuration¶
```yaml
# config.yaml
llm:
  provider: yandex  # yandex, gigachat, openai, local

  yandex:
    api_key: ${YANDEX_API_KEY}
    folder_id: ${YANDEX_FOLDER_ID}
    model: yandexgpt-lite

  gigachat:
    auth_key: ${GIGACHAT_AUTH_KEY}
    scope: GIGACHAT_API_PERS

  openai:
    api_key: ${OPENAI_API_KEY}
    model: gpt-4o

  local:
    base_url: http://localhost:11434
    model: llama3.1
```
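The `${VAR}` placeholders above are resolved from environment variables at load time. As an illustration only (the `expand_env` helper below is hypothetical, not part of the CodeGraph API), the substitution can be sketched like this:

```python
import os
import re


def expand_env(value: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    Unset variables expand to an empty string in this sketch; a real
    loader might instead raise an error for missing credentials.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), value)


os.environ["YANDEX_API_KEY"] = "your-api-key"  # placeholder value
print(expand_env("api_key: ${YANDEX_API_KEY}"))  # -> api_key: your-api-key
```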
Security Integration¶
```yaml
# config.yaml
security:
  vault:
    enabled: true
    address: https://vault.example.com
    token: ${VAULT_TOKEN}

  siem:
    enabled: true
    endpoint: https://siem.example.com/api

  dlp:
    enabled: true
    patterns:
      - credit_card
      - api_key
      - password
```
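Each entry under `dlp.patterns` names a class of sensitive data to detect. A minimal sketch of pattern-based scanning, assuming regex-backed patterns (the regexes and `find_sensitive` helper below are illustrative only; CodeGraph's actual DLP rules are not shown here):

```python
import re

# Illustrative regexes keyed by the pattern names from the config above.
DLP_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "api_key": re.compile(r"\b[A-Za-z0-9_\-]{32,}\b"),
    "password": re.compile(r"password\s*[:=]\s*\S+", re.IGNORECASE),
}


def find_sensitive(text: str) -> list[str]:
    """Return the names of DLP patterns that match the given text."""
    return [name for name, pattern in DLP_PATTERNS.items() if pattern.search(text)]
```

For example, `find_sensitive("password = hunter2")` returns `["password"]`, while text with no matches returns an empty list.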
Adding New Integrations¶
New integrations should follow this process:
1. Implement Provider Interface¶
```python
# src/llm/providers/my_provider.py
from src.llm.base_provider import BaseLLMProvider


class MyProvider(BaseLLMProvider):
    def __init__(self, config: dict):
        self.config = config

    async def generate(self, prompt: str) -> str:
        # Implementation
        pass

    async def embed(self, text: str) -> list[float]:
        # Implementation
        pass
```
2. Register Provider¶
```python
# src/llm/provider_factory.py
from src.llm.providers.my_provider import MyProvider

PROVIDERS = {
    "yandex": YandexProvider,
    "gigachat": GigaChatProvider,
    "openai": OpenAIProvider,
    "my_provider": MyProvider,  # Add new provider
}
```
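The registry maps the `llm.provider` config value to a provider class. A self-contained sketch of how the factory might resolve that mapping (the `create_provider` function and the stub classes here are hypothetical, used only to show the lookup pattern):

```python
# Stub hierarchy standing in for src.llm.base_provider, for illustration.
class BaseLLMProvider:
    def __init__(self, config: dict):
        self.config = config


class MyProvider(BaseLLMProvider):
    pass


PROVIDERS = {"my_provider": MyProvider}


def create_provider(name: str, config: dict) -> BaseLLMProvider:
    """Instantiate the provider registered under `name`."""
    try:
        cls = PROVIDERS[name]
    except KeyError:
        raise ValueError(f"Unknown LLM provider: {name}")
    return cls(config)


provider = create_provider("my_provider", {"api_key": "secret", "model": "my-model"})
```

Failing fast on an unknown name surfaces configuration typos at startup rather than at first request.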
3. Add Configuration¶
```yaml
# config.yaml
llm:
  my_provider:
    api_key: ${MY_PROVIDER_API_KEY}
    model: my-model
```
4. Document Integration¶
Create documentation in `docs/integrations/en/MY_PROVIDER.md`.
Environment Variables¶
| Variable | Description | Required For |
|---|---|---|
| `YANDEX_API_KEY` | Yandex Cloud API key | Yandex AI |
| `YANDEX_FOLDER_ID` | Yandex Cloud folder ID | Yandex AI |
| `GIGACHAT_AUTH_KEY` | GigaChat authentication key | GigaChat |
| `OPENAI_API_KEY` | OpenAI API key | OpenAI |
| `VAULT_TOKEN` | HashiCorp Vault token | Vault |
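These variables can be exported in the shell before starting CodeGraph. The values below are placeholders only; real credentials would typically come from a secrets store such as Vault:

```shell
# Placeholder values shown; substitute real credentials before use.
export YANDEX_API_KEY="your-yandex-api-key"
export YANDEX_FOLDER_ID="your-folder-id"
export GIGACHAT_AUTH_KEY="your-gigachat-auth-key"
export OPENAI_API_KEY="your-openai-api-key"
export VAULT_TOKEN="your-vault-token"
```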