Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- `ytr_UgyeR-2YX…`: "@ at the end of the day most people care more about the output. If AI art intere…"
- `ytc_Ugx675Z1V…`: "People will only be of use as a source of energy in the future? If not as no us…"
- `rdc_icgvrhp`: "It's been hundreds of thousands of years, and we don't even have a way to prove…"
- `ytc_Ugy5fC5dG…`: "Back here in 2026.. and lol GUESS WHAT AI BASICALLY HAVE NOW?? A conscience. Her…"
- `rdc_l5uw8je`: "There's never been a safety argument. The risk is unfounded and simply exists as…"
- `ytc_UgzNuOmic…`: "When ai inevitably dooms us all we shouldnt blame it, but the corporate ceo's an…"
- `ytc_Ugyt2I-ZZ…`: "IDEA; regulation must be put in place that restricts any Ai system from having c…"
- `ytc_Ugx8qC7RS…`: "Im not lying i say skynet version 0.00.01 is here. It has already learned what l…"
Comment
Humans could build " guard rails " to say...prevent A.I. from becoming sentient & acting in its own self interest. But A.I. being A.I. will get smarter exponentially & circumvent any obstacles devised by humans.
Source: youtube · Topic: AI Governance · Posted: 2023-04-18T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzcShx882zGZN9X7WN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwXQ-aAN_yINWMCwnt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2YYEghygIvxXYYRx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxpCzcwEFEjn26cud94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwy1f9PF37mMYopH3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxiEQvyoUeVKotCbG94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgweXKTuoDmhoXNLf0p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxuWxoVAOVFsqeL4IF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxMXwC3BoT42juee2x4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyq6gNj_0Zl1hidWml4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
```
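A raw batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values per dimension are inferred only from the labels visible in this sample (the full codebook is not shown here), and the function name `parse_batch` is an assumption, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred ONLY from the sample response above;
# the real codebook may define more categories, so treat these sets as illustrative.
CODEBOOK = {
    "responsibility": {"ai_itself", "company", "developer", "government", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "ban", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "outrage"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows that match the codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # every coded row must carry the comment ID it refers to
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

# Hypothetical example input with a shortened ID:
raw = '[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(parse_batch(raw))
```

Rows whose values fall outside the codebook (a common LLM failure mode) are silently dropped here; a production pipeline would more likely log them for re-coding.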