llama-cpp-capacitor vulnerabilities

A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. It supports both simple text generation and advanced chat conversations with system prompts, as well as multimodal processing, TTS, and LoRA.
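A "chat-first" design typically means the primary entry point takes an array of role-tagged messages rather than a raw prompt string. The sketch below illustrates that shape only; the names (`ChatMessage`, `LlamaContext`, `completion`) are assumptions for illustration, not the plugin's documented API — consult the llama-cpp-capacitor README for the real interface.

```typescript
// Hypothetical shape of a chat-first inference API. All names here are
// assumed for illustration and are NOT taken from llama-cpp-capacitor.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface LlamaContext {
  completion(messages: ChatMessage[]): Promise<string>;
}

// Mock implementation so the sketch is self-contained and runnable;
// a real plugin would bridge to native llama.cpp inference instead.
class MockLlamaContext implements LlamaContext {
  async completion(messages: ChatMessage[]): Promise<string> {
    const lastUser = [...messages].reverse().find(m => m.role === 'user');
    return `echo: ${lastUser?.content ?? ''}`;
  }
}

async function main(): Promise<void> {
  const ctx: LlamaContext = new MockLlamaContext();
  const reply = await ctx.completion([
    { role: 'system', content: 'You are a helpful assistant.' },
    { role: 'user', content: 'Hello' },
  ]);
  console.log(reply); // "echo: Hello"
}

main();
```

The message array carrying an explicit system prompt is the key point: it lets callers express multi-turn conversations directly instead of concatenating prompt text by hand.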

Package versions

14 versions in total

version   published      direct vulnerabilities (C critical, H high, M medium, L low)
0.0.22    12 Sep, 2025   0 C   0 H   0 M   0 L
0.0.21    11 Sep, 2025   0 C   0 H   0 M   0 L
0.0.13    31 Aug, 2025   0 C   0 H   0 M   0 L
0.0.12    31 Aug, 2025   0 C   0 H   0 M   0 L
0.0.10    31 Aug, 2025   0 C   0 H   0 M   0 L
0.0.9     31 Aug, 2025   0 C   0 H   0 M   0 L
0.0.8     31 Aug, 2025   0 C   0 H   0 M   0 L
0.0.7     30 Aug, 2025   0 C   0 H   0 M   0 L
0.0.6     30 Aug, 2025   0 C   0 H   0 M   0 L
0.0.5     30 Aug, 2025   0 C   0 H   0 M   0 L
0.0.4     30 Aug, 2025   0 C   0 H   0 M   0 L
0.0.3     30 Aug, 2025   0 C   0 H   0 M   0 L
0.0.2     30 Aug, 2025   0 C   0 H   0 M   0 L
0.0.1     29 Aug, 2025   0 C   0 H   0 M   0 L
