Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
@babatundeonabajo No, that's what the criminal law is for. This wasn't just allowing it, they encouraged it. I'm the first to defend from things like the Online Safety Act and Digital ID, but where a platform explicitly designs the platform to promote content, they are liable for what they promote (which is different from passively allowing it).
In what world can a parent know everything a teenager is doing? I agree parents have a part to play in this, but these platforms profit off algorithms they don't understand and promote content doing so. If that content is unlawful they ought to be liable for it, they aren't merely not moderating at that point. If I publish in a newspaper a dare to hang yourself until you pass out, and I cause some kid to die doing it, I will be guilty of manslaughter.
youtube
2026-03-26T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxX6AfltRTf1YfUW5B4AaABAg.ADLhRLi1qoBADNNN_hnWkt","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxX6AfltRTf1YfUW5B4AaABAg.ADLhRLi1qoBADipb62xViU","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw3CyZDV9YlkrGnf1h4AaABAg.AUne0JokrwXAVYy4zMpDYN","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugwdq9a7bdkAbwCDr754AaABAg.AUnbNCeoe7EAUpS9aR-y-7","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgzNr3CuaUO3mefzQmJ4AaABAg.AUnP0np5DpUAUnlLxnFMGk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugxk56c9V9jM8QGTXYx4AaABAg.AUnKMCZ7EU0AUnrc2qLLeZ","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugy1sUWWx3tmjaGrJsN4AaABAg.AUnKGx5A4THAUnQF_OQt7A","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugx36kc6X90CCFqAjvx4AaABAg.AE14ZwLez1wAEMfr7_1ZAH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyeSdL1WoJvfaww3m14AaABAg.ADfDPFrp3OQAEMgwFVCGBg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzvI29CvkiqWq_9TiR4AaABAg.ACexzk_morVACeyGiVrtM0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
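A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the `SCHEMA` value sets are inferred only from the values visible in this dump (the real codebook may define more categories), and `validate_batch` is an illustrative helper name, not part of any pipeline shown here.

```python
import json
from collections import Counter

# Allowed values per coding dimension, inferred from the responses above.
# ASSUMPTION: the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "ai_itself", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "liability", "ban", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that fit the schema."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # IDs in this dump use ytc_/ytr_ prefixes (comment vs. reply).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical one-row batch, shaped like the raw response above.
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
coded = validate_batch(raw)
print(Counter(r["emotion"] for r in coded))  # Counter({'outrage': 1})
```

Rows that fail validation are silently dropped here; a production pipeline would more likely log them for re-coding rather than discard them.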