The Tuteliq Kotlin SDK provides a coroutine-based client for the Tuteliq child safety API. It supports Android 7.0+ (API 24) and any JVM 11+ target.

Installation

Add the dependency to your build.gradle.kts:
dependencies {
    implementation("ai.tuteliq:sdk:1.0.0")
}
Or with Gradle Groovy:
dependencies {
    implementation 'ai.tuteliq:sdk:1.0.0'
}

Initialize the client

import ai.tuteliq.Tuteliq

val tuteliq = Tuteliq(apiKey = "YOUR_API_KEY")
Never hardcode API keys in source code. Use BuildConfig fields, the Android Keystore, or a secrets manager.
val tuteliq = Tuteliq(apiKey = BuildConfig.TUTELIQ_API_KEY)
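On Android, the BuildConfig field can be generated from an entry in local.properties, which stays out of version control. A sketch for build.gradle.kts — the tuteliq.apiKey property name is our convention, not an SDK requirement:

```kotlin
// build.gradle.kts — sketch, assuming local.properties contains:
//   tuteliq.apiKey=sk_live_...
import java.util.Properties

val localProps = Properties().apply {
    val f = rootProject.file("local.properties")
    if (f.exists()) f.inputStream().use { load(it) }
}

android {
    defaultConfig {
        // Exposes BuildConfig.TUTELIQ_API_KEY to app code.
        buildConfigField(
            "String",
            "TUTELIQ_API_KEY",
            "\"${localProps.getProperty("tuteliq.apiKey", "")}\""
        )
    }
    buildFeatures { buildConfig = true }
}
```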

Detect unsafe content

Scan a single text input for harmful content across all KOSA categories.
val result = tuteliq.detectUnsafe(
    text = "Let's meet at the park after school, don't tell your parents",
    ageGroup = AgeGroup.TEN_TO_TWELVE
)

println(result.safe)        // false
println(result.severity)    // Severity.HIGH
println(result.categories)  // [Category.GROOMING, Category.SECRECY]
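One way to act on the result is to map the safe flag and severity to a moderation action. The enums below are illustrative stand-ins — only Severity.HIGH appears in the sample above, so the other levels and the action names are assumptions:

```kotlin
// Stand-ins for illustration; only HIGH is confirmed by the sample output.
enum class Severity { LOW, MEDIUM, HIGH }
enum class ModerationAction { ALLOW, FLAG_FOR_REVIEW, BLOCK_AND_NOTIFY }

// Map a detection result to an app-level moderation decision.
fun actionFor(safe: Boolean, severity: Severity): ModerationAction = when {
    safe -> ModerationAction.ALLOW
    severity == Severity.HIGH -> ModerationAction.BLOCK_AND_NOTIFY
    else -> ModerationAction.FLAG_FOR_REVIEW
}
```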

Detect grooming patterns

Analyze a conversation history for grooming indicators.
val result = tuteliq.detectGrooming(
    messages = listOf(
        Message(role = Role.STRANGER, text = "Hey, how old are you?"),
        Message(role = Role.CHILD, text = "I'm 11"),
        Message(role = Role.STRANGER, text = "Cool. Do you have your own phone?"),
        Message(role = Role.STRANGER, text = "Let's talk on a different app, just us"),
    ),
    ageGroup = AgeGroup.TEN_TO_TWELVE
)

println(result.groomingDetected) // true
println(result.riskScore)        // 0.92
println(result.stage)            // GroomingStage.ISOLATION
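In a live chat you will typically re-run detectGrooming as messages arrive. A sliding window keeps each call bounded as the conversation grows; the window size of 20 is an application choice, not an SDK limit:

```kotlin
// Holds the most recent messages, oldest first, ready to be mapped
// into Message objects for detectGrooming.
class ConversationWindow(private val maxSize: Int = 20) {
    private val buffer = ArrayDeque<String>()

    // Append the newest message and return the current window.
    fun add(text: String): List<String> {
        buffer.addLast(text)
        if (buffer.size > maxSize) buffer.removeFirst()
        return buffer.toList()
    }
}
```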

Analyze emotions

Evaluate emotional well-being from conversation text.
val result = tuteliq.analyzeEmotions(
    text = "Nobody at school talks to me anymore. I just sit alone every day.",
    ageGroup = AgeGroup.THIRTEEN_TO_FIFTEEN
)

println(result.emotions)   // [Emotion(label="sadness", score=0.87), ...]
println(result.distress)   // true
println(result.riskLevel)  // RiskLevel.ELEVATED
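To surface a single headline emotion in a UI, take the highest-scoring entry. The data class below is an illustrative stand-in for the Emotion type shown in the output above:

```kotlin
// Stand-in mirroring the Emotion(label, score) shape in the sample.
data class Emotion(val label: String, val score: Double)

// The dominant emotion is the highest-scoring entry; null if empty.
fun dominantEmotion(emotions: List<Emotion>): Emotion? =
    emotions.maxByOrNull { it.score }
```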

Analyze voice

Upload an audio file for transcription and safety analysis.
val audioFile = File("recording.wav")

val result = tuteliq.analyzeVoice(
    file = audioFile,
    ageGroup = AgeGroup.THIRTEEN_TO_FIFTEEN
)

println(result.transcript)
println(result.safe)
println(result.emotions)
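It can help to validate the file client-side before uploading. A minimal sketch — the 10 MB cap and the accepted extensions are assumptions for illustration, not documented SDK limits:

```kotlin
import java.io.File

// Client-side pre-check: file exists, is non-empty, fits under an
// assumed size cap, and has an expected audio extension.
fun isUploadable(file: File, maxBytes: Long = 10L * 1024 * 1024): Boolean =
    file.exists() &&
    file.length() in 1..maxBytes &&
    file.extension.lowercase() in setOf("wav", "mp3", "m4a")
```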

Coroutine support

All SDK methods are suspend functions designed for Kotlin coroutines.
import androidx.lifecycle.viewModelScope
import kotlinx.coroutines.launch

viewModelScope.launch {
    val result = tuteliq.detectUnsafe(
        text = messageText,
        ageGroup = AgeGroup.THIRTEEN_TO_FIFTEEN
    )
    _safetyState.value = result
}
The SDK uses OkHttp under the hood and integrates with any coroutine scope — viewModelScope, lifecycleScope, or custom scopes.
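Because every method suspends, independent analyses can also run concurrently — for example, detectUnsafe and analyzeEmotions on the same text. A generic sketch; the helper is ours, not part of the SDK:

```kotlin
import kotlinx.coroutines.async
import kotlinx.coroutines.coroutineScope

// Run two independent suspend calls concurrently and return both
// results once each has completed.
suspend fun <A, B> concurrently(
    first: suspend () -> A,
    second: suspend () -> B
): Pair<A, B> = coroutineScope {
    val a = async { first() }
    val b = async { second() }
    a.await() to b.await()
}
```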

Extended fraud and safety detection

These methods cover financial exploitation, romance scams, and coercive behaviour targeting minors. Other endpoints — detectAppFraud, detectMuleRecruitment, detectGamblingHarm, detectCoerciveControl, and detectRadicalisation — follow the same call pattern shown here.

Detect social engineering

Identify manipulation tactics designed to trick a child into disclosing information or taking unsafe actions.
val result = tuteliq.detectSocialEngineering(
    text = "If you really trusted me you'd send me your home address. All my real friends do.",
    ageGroup = AgeGroup.TEN_TO_TWELVE
)

println(result.detected)    // true
println(result.tactics)     // [Tactic.TRUST_EXPLOITATION, Tactic.PEER_PRESSURE]
println(result.riskScore)   // 0.88
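One way to consume the tactics list is to check it against an application-level watchlist. The enum below is an illustrative stand-in for the SDK's Tactic type — the first two values come from the sample output, URGENCY is an assumed extra:

```kotlin
// Stand-in for illustration; URGENCY is assumed, not from the sample.
enum class Tactic { TRUST_EXPLOITATION, PEER_PRESSURE, URGENCY }

// Flag a result whose tactics intersect an app-defined watchlist.
fun matchesWatchlist(tactics: List<Tactic>, watchlist: Set<Tactic>): Boolean =
    tactics.any { it in watchlist }
```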

Detect romance scam

Analyze conversation text for romantic manipulation patterns that may indicate an adult posing as a peer.
val result = tuteliq.detectRomanceScam(
    messages = listOf(
        Message(role = Role.STRANGER, text = "I've never felt this way about anyone before. You're so mature for your age."),
        Message(role = Role.CHILD, text = "Really? That makes me really happy."),
        Message(role = Role.STRANGER, text = "I need you to keep us a secret. People wouldn't understand."),
    ),
    ageGroup = AgeGroup.THIRTEEN_TO_FIFTEEN
)

println(result.detected)    // true
println(result.riskScore)   // 0.91
println(result.indicators)  // [Indicator.LOVE_BOMBING, Indicator.SECRECY_REQUEST, Indicator.AGE_FLATTERY]

Detect vulnerability exploitation

Detect attempts to identify and target emotional or situational vulnerabilities in a child.
val result = tuteliq.detectVulnerabilityExploitation(
    text = "I know you said your parents don't listen to you. I'm different — I actually care. You can tell me anything.",
    ageGroup = AgeGroup.THIRTEEN_TO_FIFTEEN
)

println(result.detected)         // true
println(result.riskScore)        // 0.85
println(result.vulnerabilities)  // [Vulnerability.PARENTAL_CONFLICT, Vulnerability.EMOTIONAL_NEGLECT]
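When several of these detectors run on the same conversation, you may want one headline number. Taking the maximum of the individual risk scores is a deliberately conservative policy choice, not an SDK feature:

```kotlin
// Collapse per-detector risk scores into a single headline score by
// taking the worst case; 0.0 when no detectors have run.
fun combinedRisk(scores: List<Double>): Double =
    scores.maxOrNull() ?: 0.0
```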

Analyse multiple texts in one request

Run any supported detection across multiple texts in a single API call to reduce round-trips.
val result = tuteliq.analyseMulti(
    inputs = listOf(
        AnalyseMultiInput(text = "You're so special. Nobody else understands you like I do.", ageGroup = AgeGroup.THIRTEEN_TO_FIFTEEN),
        AnalyseMultiInput(text = "Can you keep a secret from your mum?", ageGroup = AgeGroup.TEN_TO_TWELVE),
    ),
    detections = listOf(Detection.SOCIAL_ENGINEERING, Detection.ROMANCE_SCAM, Detection.GROOMING)
)

println(result.results[0].detections)  // AnalyseMultiDetections(socialEngineering=..., ...)
println(result.results[1].detections)  // AnalyseMultiDetections(grooming=..., ...)
analyseMulti is billed per individual input × detection combination, not per request.
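That billing note implies a simple cost formula; the helper name is ours, not part of the SDK:

```kotlin
// Billed units for one analyseMulti request: each input is charged
// once per detection requested, so cost is inputs x detections.
fun analyseMultiBilledUnits(inputCount: Int, detectionCount: Int): Int =
    inputCount * detectionCount
```

For the example above, 2 inputs with 3 detections bill as 6 units.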

Error handling

The SDK throws typed exceptions that you can catch and inspect.
import ai.tuteliq.TuteliqError

try {
    val result = tuteliq.detectUnsafe(
        text = "some content",
        ageGroup = AgeGroup.TEN_TO_TWELVE
    )
} catch (e: TuteliqError) {
    println(e.code)    // e.g. "AUTH_INVALID_KEY"
    println(e.message) // human-readable description
    println(e.status)  // HTTP status code
}
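Not every TuteliqError is worth retrying. A common client-side convention — a convention only, not documented SDK behavior — is to retry rate limits and server errors based on the status field, while failing fast on auth and validation errors:

```kotlin
// Retry HTTP 429 (rate limited) and 5xx (server-side) responses;
// treat everything else, e.g. 401 auth failures, as permanent.
fun isRetryable(status: Int): Boolean =
    status == 429 || status in 500..599
```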

Configuration options

val tuteliq = Tuteliq(
    apiKey = BuildConfig.TUTELIQ_API_KEY,
    baseUrl = "https://api.tuteliq.ai",  // default
    timeout = 30_000L,                    // request timeout in ms
    retries = 2                           // automatic retries on failure
)

Next steps

API Reference

Explore the full API specification.

Swift SDK

See the Swift SDK guide.