Raw LLM Responses
Inspect the exact model output for any coded comment; records can also be looked up by comment ID.
Random samples

- `ytc_UgwR9A7_T…` — "I was driving my Duramax Diesel truck when it started making a noise. I rolled d…"
- `ytc_UgyHRxAI7…` — "I worked in IT Tech & loved my job. Whilst I see the value of IT in our world, t…"
- `ytr_Ugz1ormsN…` — "@AmyThePuddytat there are, and another example would be self-driving (fully auto…"
- `ytc_Ugx70sPqV…` — "It’s mind transfer from something to the robot body. The aim was never about rob…"
- `ytc_Ugw2uArkd…` — "My concern is that some artists might just give up on doing their art. But that …"
- `rdc_nagw4gv` — "It IS just a faster google search. Or a quick fact checker. But if I don’t tell …"
- `ytc_UgwxoUkef…` — "It's not all the factory workers that are going to be the main ones replaced by …"
- `ytc_UgwS5J8Go…` — "And also there’s no limits to AI the more they get trained the more they can per…"
Comment

> My bet is 2035 idk why but 2027 just feels too close, also I don’t think LLM’s lead to AGI, or are these companies diverting to different architecture behind closed doors?

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2026-01-27T14:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugwz2ivQONX8sa3gLkl4AaABAg.AT4HO84HBY_AT8gPc4lk2H","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzSOXtWS_9DS33CN214AaABAg.ASyvyU9QX2IASztrVIf1Xx","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgyAs1aPiCfu1j11b814AaABAg.ASGCGJ0gwgZAVDFCpxwICt","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugx0lbXeExXBIJm7eDB4AaABAg.ARyBi9G4JueAVDHhQ7VMCy","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyOx4qCgvsDbF2EJDh4AaABAg.ARuItSRqukRASFLb-1WeZD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzinhIp_x1XMCFvnG54AaABAg.ARgT0ISqGr_ARrLvIn1zyC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugw1y2fBzBE3tOKsKah4AaABAg.ARO4jvFvHk5ASUbDtqw48f","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyYW4k45r1Y8A-5S5J4AaABAg.ARMthV7V8F2ARNJ385kGmC","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytr_Ugzoq_28VWSRAlrR0G54AaABAg.ARMmONyRgrNARZZrlb3TmD","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytr_UgybYkKrhYMp3uaF0Px4AaABAg.ARMlMUnz7f4ARrLM90QljF","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
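A raw response like the one above can be turned into the per-comment coding shown in the result table. The sketch below is a minimal, hypothetical parser: it assumes the model always returns a JSON array of objects with exactly these four dimensions, and the allowed value sets are inferred from the examples on this page (the actual codebook may define more categories). The record ID and values in the usage line are made up for illustration.

```python
import json

# Allowed values per dimension, inferred from the sample output above
# (assumption -- the real codebook may include additional categories).
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes}.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the assumed schema, so bad model output fails loudly
    instead of silently entering the coded dataset.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in SCHEMA}  # KeyError if a dimension is absent
        for dim, value in codes.items():
            if value not in SCHEMA[dim]:
                raise ValueError(f"{cid}: invalid {dim} value {value!r}")
        coded[cid] = codes
    return coded

# Hypothetical record, shaped like the samples above.
raw = ('[{"id":"ytr_abc123","responsibility":"company",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytr_abc123"]["responsibility"])  # company
```

Keyed by comment ID, the resulting dict supports the "look up by comment ID" inspection described at the top of this page.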