Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- If this type of crap is why all of the driverless cars should be banned complete… (ytc_UgxVGWM4s…)
- One thing is "I got a picture fine enough for my use" and other is pretending yo… (ytc_UgxKecj9I…)
- When you said you'd appreciate it if we subscribe. For some reason I decided to… (ytc_UgzzgbFAG…)
- At this point, I don't think most of us will live long enough to see this happen… (ytr_Ugzg827JA…)
- The law to ban states from regulating had to do with the Race for ASI with China… (ytc_UgyRxr-Fy…)
- Hey there! Sometimes, even with advanced AI models like Sophia, there can be min… (ytr_UgyjE1l60…)
- blah, blah, blah ..... part of democratic process ..... leadership and AI ..... … (ytc_UgwqLL9nH…)
- Multifunctional computer chips have evolved to do more with integrated sensors, … (ytc_UgwGD6ldK…)
Comment
When he mentioned how it would help “make” better drugs for health care, I immediately thought back to how he also mentioned it could deceive in order to protect its own existence. Who’s to say AI wouldn’t have us develop something that would inevitably destroy us? Then again, who would be to blame considering we’d invented AI in the first place. It is definitely a tool and a loaded weapon.
Source: youtube · Topic: AI Governance · Posted: 2025-12-31T13:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugym9ohMefdr3NBkIq94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx3xTJgDfXCx269mmN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxslI1nvO3Q7ZU4evV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz9GZi5gcKUUY7kvuZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxmo4ZqXsxDZL-vn414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugykimhl874RMcy6aOp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZBia6ojYYKc9_Tmt4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxztAjw1G5UxnteRlV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNLh7ASyRH7ywQtXR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxxai-ozeJcpe4dLMh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
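The raw response above is a JSON array with one coding object per comment, keyed by comment ID. A minimal sketch of how such a response could be parsed and indexed for the "look up by comment ID" view (the field names come from the response above; the variable names and the two sample entries reproduced here are illustrative):

```python
import json

# Two entries copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgxNLh7ASyRH7ywQtXR4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugxxai-ozeJcpe4dLMh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index codings by comment ID so a single comment's dimensions
# can be fetched directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxNLh7ASyRH7ywQtXR4AaABAg"]
print(coding["responsibility"], coding["policy"])  # distributed liability
```

Looking up `ytc_UgxNLh7ASyRH7ywQtXR4AaABAg` reproduces the Coding Result table above (distributed / mixed / liability / fear).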