Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Meta has broken ground on a demon center in Tulsa Oklahoma. Cherokee Nation rese…
ytc_UgylcnMdU…
ai learning from art is no different from a human artist learning from watching …
ytc_UgzpfHh3T…
*"AI Is gonna take our jobs"*
"aww man.. the engines of my plane is gone. welp i…
ytc_UgxreJppP…
Usually economic collapse is the result of no productivity. That's why war time …
rdc_ktxo3hb
Isn't that main issue? The AI's essentially using existing material from everyw…
rdc_jj4hnlt
Just for information, Glaze and Nightshare are both automatically removed when A…
ytc_Ugx-NLoT8…
Scariest thing to me is the idea that an advanced AI can edit its own code to be…
ytc_Ugx0hGax6…
Skynet is gonna be what we call the union formed by human employees and ai emplo…
ytc_Ugy5meRrr…
Comment
The differences between this tech and nuclear power are:
1. Nuclear bombs don't have a brain, they can't decide what do to with themselves
2. AI is not as destructive as a nuclear bomb, but rather disruptive, causing changes in the basements of civilization and culture, as opposed to sheer direct destruction, but:
3. AI could manipulate and trigger nuclear bombs if it decides to do so
youtube
AI Governance
2023-05-10T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwt_9L9s1cMQUynBS54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyrnILZIDheb5StTxZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwirKaD5DeF4wOgyix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgznWHPo76axNk8mq3Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx7HVGwgSwBNqF2vg54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKvDsRLjYiqlACyCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgydmufFqaE_TWOpTjJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy03Q6WrXntB6U3Jvh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyuS5CJK6qFISPm8wZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw526q5I6bZRoqu6px4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
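
The coding result table above is one record from the batch JSON returned by the model; a downstream script has to parse that raw response and index it by comment ID. Below is a minimal sketch in Python. It validates only against the dimension values actually observed in this sample batch (the full codebook may define more), and `parse_coding_batch` is a hypothetical helper name, not part of any real pipeline shown here.

```python
import json

# Dimension values observed in this sample batch only; the full
# codebook may define additional values (assumption for illustration).
OBSERVED_VALUES = {
    "responsibility": {"company", "none", "government", "ai_itself", "user"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "indifference", "approval", "outrage", "resignation", "mixed"},
}

def parse_coding_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a
    value outside the observed set.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in OBSERVED_VALUES.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        # Keep only the coded dimensions, keyed by comment ID.
        coded[cid] = {dim: rec[dim] for dim in OBSERVED_VALUES}
    return coded
```

Looking up a record then mirrors the detail view above, e.g. `parse_coding_batch(raw)["ytc_Ugx7HVGwgSwBNqF2vg54AaABAg"]["emotion"]` would return `"fear"` for this batch.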