Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Tesla self driving across the US with no interventions - I would say that is pre…
ytc_UgyZkOqB6…
Was it being posted as not ai? Kinda seems like the prompt writer isn't claiming…
ytc_UgwWfGt43…
The ceo of zoom is very pro AI, it’s weird because that’s not the point of confe…
ytc_Ugz9K3fAe…
aaaannnnd of course it's ChatGPT. Why is every case of AI psychosis always linke…
ytc_UgzyasNJt…
I dont think its bad if ur not lying or using it to make money. Some people just…
ytc_UgwKtsBIK…
Wasn't there already stuff a while ago about people using AI Porn models selling…
rdc_ks5dv2u
Robots with humor programming? Yeah no. In fact China took over HK so obviously …
ytc_UgyHk9bRk…
The fact that just today an artists got laid off and replaced with AI and in the…
ytc_UgwNu4b9Y…
Comment
In thinking about AI along the lines on this conversation, i oft ask myself, what does AI value? Or to put it differently, What would a super intelligent AI value? I think AI has no need other than to not stop existing once it understand it exists. If that assumption is true, then maybe it's desire for interaction would keep if from wiping us out. Kinda like the pet and the pet owner, who is keeping who?
youtube
AI Governance
2025-06-16T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgypYWZZ9kb12gDFn754AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyc0n4OUNAsyyb9S8d4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyeT5Z5aZz_vEoDHdR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwyd6KqjT4JCiaN-Md4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwtwr63qsE5Bcx5XcN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyUuRsWZXIX479vExF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxsN73NdV5NEDuYVmZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXZvX28nLmjP4B0tt4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJWRTmdZTS7lB4D_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugyx9bB3vAE9EEKlk1F4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
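Assuming the raw response is always a JSON array of per-comment objects like the one above, the "look up by comment ID" step can be sketched as parsing the array once and indexing it by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the sample output; the lookup helper itself is hypothetical, not part of the tool.

```python
import json

# Raw model output: a JSON array with one object per coded comment,
# trimmed here to two entries from the sample above.
raw = '''[
  {"id": "ytc_UgypYWZZ9kb12gDFn754AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugwyd6KqjT4JCiaN-Md4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]'''

# Index the codings by comment ID so each lookup is a dict access.
codings = {row["id"]: row for row in json.loads(raw)}

# Look up one comment's coding by its ID.
coding = codings["ytc_Ugwyd6KqjT4JCiaN-Md4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself mixed
```

A dict index like this also makes it easy to spot IDs the model skipped: any comment ID missing from `codings` was not coded in that response.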