The Tuteliq Unity SDK provides a client for the Tuteliq child safety API and targets Unity 2021.3 LTS and later. It supports both coroutine and async/await patterns.
Installation
Unity Package Manager
- Open Window > Package Manager.
- Click + and select Add package from git URL.
- Enter:
https://github.com/Tuteliq/unity.git
Manual installation
Download the latest .unitypackage from the GitHub releases and import it into your project.
Initialize the client
using Tuteliq;
var tuteliq = new TuteliqClient("YOUR_API_KEY");
Never hardcode API keys in source code. Load the key at runtime from a ScriptableObject config, an environment variable, or a secure secrets service instead.
// On a MonoBehaviour: assign a TuteliqConfig asset in the Inspector.
[SerializeField] private TuteliqConfig config;

void Start()
{
    var tuteliq = new TuteliqClient(config.ApiKey);
}
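If the SDK does not already provide a TuteliqConfig type, a minimal sketch of one you could define yourself (the class name, menu path, and fields here are assumptions, not part of the SDK):
using UnityEngine;

// Hypothetical ScriptableObject asset holding the API key; keep production assets out of source control.
[CreateAssetMenu(fileName = "TuteliqConfig", menuName = "Tuteliq/Config")]
public class TuteliqConfig : ScriptableObject
{
    [SerializeField] private string apiKey;

    public string ApiKey => apiKey;
}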
Detect unsafe content
Scan a single text input for harmful content across all KOSA categories.
var result = await tuteliq.DetectUnsafeAsync(
    text: "Let's meet at the park after school, don't tell your parents",
    ageGroup: AgeGroup.TenToTwelve
);
Debug.Log(result.Safe); // false
Debug.Log(result.Severity); // Severity.High
Debug.Log(result.Categories); // [Category.Grooming, Category.Secrecy]
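One way to act on the result; the blocking and notification logic shown in the comment is your own code, not part of the SDK:
if (!result.Safe)
{
    // Suppress the message and surface the flagged categories to your moderation tooling.
    Debug.LogWarning($"Blocked ({result.Severity}): {string.Join(", ", result.Categories)}");
}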
Detect grooming patterns
Analyze a conversation history for grooming indicators.
var result = await tuteliq.DetectGroomingAsync(
    messages: new[]
    {
        new Message(Role.Stranger, "Hey, how old are you?"),
        new Message(Role.Child, "I'm 11"),
        new Message(Role.Stranger, "Cool. Do you have your own phone?"),
        new Message(Role.Stranger, "Let's talk on a different app, just us"),
    },
    ageGroup: AgeGroup.TenToTwelve
);
Debug.Log(result.GroomingDetected); // true
Debug.Log(result.RiskScore); // 0.92
Debug.Log(result.Stage); // GroomingStage.Isolation
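A sketch of escalating on high risk; the 0.8 threshold is illustrative, and the escalation step is your own moderation pipeline:
if (result.GroomingDetected && result.RiskScore >= 0.8f)
{
    // Escalate to a human moderator / trust & safety queue here.
    Debug.LogWarning($"Grooming risk {result.RiskScore:F2} at stage {result.Stage}");
}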
Analyze emotions
Evaluate emotional well-being from conversation text.
var result = await tuteliq.AnalyzeEmotionsAsync(
    text: "Nobody at school talks to me anymore. I just sit alone every day.",
    ageGroup: AgeGroup.ThirteenToFifteen
);
Debug.Log(result.Emotions); // [Emotion { Label = "sadness", Score = 0.87 }, ...]
Debug.Log(result.Distress); // true
Debug.Log(result.RiskLevel); // RiskLevel.Elevated
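A sketch of responding to the distress flag; how you surface support resources or notify a guardian is up to your game, not the SDK:
if (result.Distress)
{
    // Show in-game support resources / follow your own safeguarding policy here.
    foreach (var emotion in result.Emotions)
    {
        Debug.Log($"{emotion.Label}: {emotion.Score:F2}");
    }
}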
Analyze voice
Analyze in-game voice chat for safety concerns.
var audioClip = Microphone.Start(null, false, 10, 16000);
// ... record audio ...
Microphone.End(null);
var audioData = AudioClipToWav(audioClip); // helper that encodes the clip as WAV bytes (see sketch below)
var result = await tuteliq.AnalyzeVoiceAsync(
    file: audioData,
    ageGroup: AgeGroup.ThirteenToFifteen
);
Debug.Log(result.Transcript);
Debug.Log(result.Safe);
Debug.Log(result.Emotions);
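AudioClipToWav above is not a Unity API; it stands in for whatever WAV-encoding helper your project uses. A minimal sketch of one possible implementation (16-bit PCM, illustrative only):
using System.IO;
using UnityEngine;

public static class AudioUtil
{
    // Encodes an AudioClip as a 16-bit PCM WAV byte array (illustrative helper, not part of the SDK).
    public static byte[] AudioClipToWav(AudioClip clip)
    {
        var samples = new float[clip.samples * clip.channels];
        clip.GetData(samples, 0);

        using (var stream = new MemoryStream())
        using (var writer = new BinaryWriter(stream))
        {
            int byteRate = clip.frequency * clip.channels * 2;
            int dataSize = samples.Length * 2;

            // RIFF header
            writer.Write(System.Text.Encoding.ASCII.GetBytes("RIFF"));
            writer.Write(36 + dataSize);
            writer.Write(System.Text.Encoding.ASCII.GetBytes("WAVE"));

            // fmt chunk (PCM, 16-bit)
            writer.Write(System.Text.Encoding.ASCII.GetBytes("fmt "));
            writer.Write(16);
            writer.Write((short)1);                   // PCM format
            writer.Write((short)clip.channels);
            writer.Write(clip.frequency);
            writer.Write(byteRate);
            writer.Write((short)(clip.channels * 2)); // block align
            writer.Write((short)16);                  // bits per sample

            // data chunk
            writer.Write(System.Text.Encoding.ASCII.GetBytes("data"));
            writer.Write(dataSize);
            foreach (var sample in samples)
            {
                writer.Write((short)(Mathf.Clamp(sample, -1f, 1f) * short.MaxValue));
            }

            return stream.ToArray();
        }
    }
}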
Coroutine support
For projects that prefer coroutines over async/await:
using UnityEngine;
using Tuteliq;
public class ChatModerator : MonoBehaviour
{
    [SerializeField] private TuteliqConfig config; // assign in the Inspector

    private TuteliqClient _tuteliq;

    void Start()
    {
        _tuteliq = new TuteliqClient(config.ApiKey);
    }

    public void CheckMessage(string text)
    {
        StartCoroutine(_tuteliq.DetectUnsafe(
            text: text,
            ageGroup: AgeGroup.ThirteenToFifteen,
            onComplete: result =>
            {
                if (!result.Safe)
                {
                    Debug.LogWarning($"Unsafe content: {result.Severity}");
                    // Block message, notify moderator, etc.
                }
            }
        ));
    }
}
Both coroutine and async/await patterns use the same underlying HTTP client. Choose whichever fits your project architecture.
Error handling
The SDK throws typed exceptions that you can catch and inspect.
try
{
    var result = await tuteliq.DetectUnsafeAsync(
        text: "some content",
        ageGroup: AgeGroup.TenToTwelve
    );
}
catch (TuteliqException ex)
{
    Debug.LogError($"{ex.Code}: {ex.Message}");
}
Configuration options
var tuteliq = new TuteliqClient(new TuteliqOptions
{
    ApiKey = config.ApiKey,
    BaseUrl = "https://api.tuteliq.ai",  // default
    Timeout = TimeSpan.FromSeconds(30),  // request timeout
    Retries = 2                          // automatic retries on failure
});
Next steps