Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Humanity needs to really think about its future. Will AGI and Robots and Automat…" (ytc_UgxUQc9Rm…)
- "this is so fucking ture, im so done with people crying about it being unethical…" (ytc_UgzK4rRJo…)
- "It also ignores the history of how silicon valley was really founded. The DOD fu…" (rdc_oi1fq1u)
- "I pay for YT premium for no commercials period the reason I have NOT SUBSCRIBED …" (ytc_UgwB6OJ30…)
- "A year later tesla launch theor first human robot. Too much for marking his word…" (ytc_Ugx3MG22w…)
- "Reading the comments here is restoring my hopes for the future. Only demented te…" (ytc_UgxTDwlnJ…)
- "> The new rules require platforms to label AI-generated content with markers …" (rdc_o19ixmn)
- "This question isn't meant to check whether you're a robot or not because "robots…" (ytc_UgyAZp2uD…)
Comment
Here in the divided states the maternal instinct has been stomped into the ground. There's a brutal war going on women since October 91. Our fascist tech bro oligarchs say empathy is a vice, a fault, a negative trait. I love how man is destroying the s*** out of himself In every way possible. The environment, war, AI.
youtube · AI Governance · 2025-08-14T13:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
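A coding row like the one above can be checked against the category sets that appear in this page's data. This is a minimal sketch; the allowed-value sets are assumptions inferred only from the rows shown here, not a documented codebook, and `validate_row` is an illustrative helper name:

```python
# Categories observed in the coding results on this page; an assumption,
# not an authoritative schema.
OBSERVED_VALUES = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "approval", "fear", "indifference", "resignation"},
}

def validate_row(row: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if row.get(dim) not in allowed]

# The coding result shown in the table above passes cleanly.
row = {"responsibility": "company", "reasoning": "deontological",
       "policy": "regulate", "emotion": "outrage"}
print(validate_row(row))  # → []
```

A row missing a dimension, or using an unseen label, would be reported by dimension name, which makes silent coding drift easy to catch in bulk.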
Raw LLM Response
[
{"id":"ytc_UgzRtarstrWwRQMu22B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugwan1_9DvbcG1rRH_F4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyROXPeek-vNeZ3gTR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzZqjoPmfec_UbUZyB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx0MY7jTLC3DrmuIdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhET_KY1Z7alju2wd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx7m3TG2agDZKUskWN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySqql1ODiK3_Nl19N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyBVV-hqlslcb-LIk14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx0Erj-kHymCzqunUN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]