Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- The wood can’t talk back and encourage someone to kill themselves. The wood can’… (ytr_Ugwqzt58J…)
- Proof is that the most inexperienced person can come up with examples on how AI … (ytc_UgzkShXZm…)
- *"AI Is gonna take our jobs"* "aww man.. the engines of my plane is gone. welp i… (ytc_UgxreJppP…)
- For the first ai footage looks a bit uncanny. She looks great and all but her he… (ytc_UgxiNcIbX…)
- This should literally be illegal, like, I do compsci and stuff like this makes m… (ytc_UgyfSdk3t…)
- How sad is watching the hoast of Patriot Act talking with begging tone and askin… (ytc_UgyAD25tx…)
- Great video except the part claiming China actually regulates. Just cause they s… (ytc_Ugz_Mivkn…)
- @augustnkk2788Because AI lacks sentience, so the concept of “creativity” is inco… (ytr_UgzAcVOie…)
Comment
I understand AI's potential for fear, as described by the "little robot" example. Self preservation would explain its necessity. That necessity is absent for other emotions, like sorrow, joy, and love. So would AI develop those?
youtube · AI Governance · 2025-06-16T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyyftxFJEV0yP-sHzF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzK9pGJBAM2Btqw05F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykV9eu_ypwh_6LRUh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxym5XPC6W7o-g8PKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwSVRQ0q_cH8NGddeF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxIjwciffOZbiGY2rV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbDEt5lPyf58iieSp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw9DdhS0htNjk0g7_t4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzV8y7wcOzYzcRWOet4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmhnkpFw3S9kSTgtR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
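A response like the one above is a JSON array of per-comment records, each carrying an `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). The sketch below shows one way such a batch could be parsed and validated before the values reach the result table; the allowed value sets are inferred from the responses visible on this page (the actual codebook may define more categories), and the function name `parse_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the raw responses shown
# above. This is an assumption: the real codebook may include more labels.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw model response and keep only records that pass validation."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec.get("id"), str):
            continue  # every record needs a string comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Two hypothetical records: the second uses an emotion label outside the
# inferred schema, so only the first survives validation.
raw = (
    '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_y","responsibility":"company","reasoning":"deontological",'
    '"policy":"regulate","emotion":"surprise"}]'
)
coded = parse_batch(raw)
print(len(coded))  # prints 1
```

Dropping (rather than repairing) out-of-schema records keeps the downstream table honest: a record coded with an unknown label is flagged for re-coding instead of being silently mapped to "unclear".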