Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
- Elon has long been trying to remove radar sensors and wanting to rely solely on … (ytc_UgyReFE13…)
- If the data is corrupted then the conclusion is probably wrong. Most AI is corru… (ytr_Ugz4ytPan…)
- A driverless car does not mean u can sit at the wheel knitting your old man a sc… (ytc_UgwpojYDd…)
- @sagestrings869 dude, modern Artist is not one single starving guy painting a la… (ytr_Ugyab33lH…)
- One question I always have and which distinguishes AI (II = Inorganic Intelligen… (ytc_UgxfU8Ciu…)
- Gonna be honest here: I don't actually hate AI content. My problem, and I gues… (ytc_Ugyg_IVBX…)
- I see that all the time. It's happening right now at an alarming rate. And the… (ytr_UgxAOlG63…)
- Tesla Autopilot should never have been allowed on the road in the first place. I… (ytc_UgwpSsLV-…)
Comment
In order for ai to want to be selfish, it would have to have desire. Just because it may gain high levels of intelligence doesn't mean that it will have selfish desire. A sign of high intelligence is the ability to overcome irrational emotional desire. Ai has not evolved in a struggle to survive the way humans have, therefore it doesn't have programming necessary for the desire for self preservation. Humans are projecting the nature of humans onto ai as if ai will have the same destructive nature that humans do when ai did not evolve through the same struggles that humanity has had to evolve through that has given humans a selfish nature for the purpose of survival. Is the spawning of self awareness directly tied to the survival instinct? In otherwords will ai have a want to be everlasting simply because it becomes self aware? I don't really see an inherent link between the 2
Source: youtube | Topic: AI Governance | Posted: 2025-06-16T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
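The dimension values seen across this page suggest a closed vocabulary per dimension. A minimal sketch of validating one coded record against those vocabularies — the value sets below are inferred from the visible samples, not an authoritative codebook, and the function name is illustrative:

```python
# Allowed values inferred from the coded samples on this page
# (not an authoritative schema; extend as the real codebook dictates).
DIMENSIONS = {
    "responsibility": {"government", "developer", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def invalid_fields(record: dict) -> list:
    """Return the names of dimensions whose value falls outside the vocabulary."""
    return [dim for dim, allowed in DIMENSIONS.items()
            if record.get(dim) not in allowed]

# The coded record shown in the table above:
record = {"responsibility": "ai_itself", "reasoning": "mixed",
          "policy": "none", "emotion": "mixed"}
print(invalid_fields(record))  # []
```

A check like this catches the most common failure mode of batch LLM coding: the model inventing an off-vocabulary label that silently pollutes downstream counts.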
Raw LLM Response
[
{"id":"ytc_Ugxr6fyHTZOBODgZwSx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzP4iv1Rz8nD6jbSMl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyWZ1ABfkOF_FwQu3Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwpoxZbYFkYMFz0ejN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyZoIIx12D3wjWYhsZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgzRjlqeZF4unDbCpcF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
{"id":"ytc_UgyaBRCXYKen3UjWySR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzDn1VjbAIm7QOfVy14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyIhnw_SuAtHmli6MN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyAdkneji7-9amxnip4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
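The lookup-by-ID view above can be reproduced offline. A minimal sketch, assuming the raw batch response is stored as a JSON string shaped like the array above (the sample IDs and the function name here are illustrative):

```python
import json

# Hypothetical stored raw response: a JSON array of per-comment codes,
# mirroring the shape of the batch shown above.
raw_response = '''
[
  {"id": "ytc_Ugxr6fyHTZOBODgZwSx4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyWZ1ABfkOF_FwQu3Z4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"}
]
'''

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and index the coded dimensions by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"}
            for row in rows}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugxr6fyHTZOBODgZwSx4AaABAg"]["policy"])  # regulate
```

Indexing by ID rather than list position matters here because the model is not guaranteed to return rows in the order the comments were submitted.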