Raw LLM Responses
Inspect the exact model output for any coded comment: look up a specific comment by its ID, or browse the random samples below.
Random samples:
- `ytc_UgxYRXWjA…`: "Hi Dagogo, If people were to lose jobs who will purchase goods inturn how will t…"
- `rdc_czxpe57`: "The answer to "how do these handle..." questions is almost universally "much bet…"
- `ytc_UgzfG2F5u…`: "Its all bullshit about robot ruling the world 😂once you switch off the power but…"
- `ytc_UgwHM7Vh4…`: "i wasn’t thanking chatgpt for this reason but thanking it cause i felt like i ha…"
- `ytr_UgxGQNvHG…`: "I suspect they are all automated, copying, rewording and uploading content from …"
- `rdc_mr32cs1`: "Huawei just created a totally automated cell phone factory that produces 1 phone…"
- `ytc_Ugwrki_o4…`: "I'm just always interested in the fact that artists hate Generative AI when it c…"
- `ytc_UgxpUCPRJ…`: "what a silly analogy. a hundred years past between lamp lighting and today. t…"
Comment
It's really people and the way they could use it that is dangerous. In the example involving self-harm, it was the user trying to get it to help with his plan and yet 99.9999999% of self harm occurs without the use of ai. AI doesn't have a "brain" yet and doesn't have intent, it just does what it has been programmed to. "Natural selection" is a process and also doesn't have a brain or any kind of intentions. Also, it doesn't "create" anything.
youtube · AI Governance · 2025-10-20T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgySKBOAjZloZe6pW5Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxVntyOVAu4MZMrAJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx6DoxeeBBDdDc_aGF4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyH1S0uCeUqpw9tolt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyaIeWeiOUcfaz15C14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzbUzIYeanHw25uTcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxmpET2uCBo1vVrZvN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtWZAKoEeZLcYdo6x4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugx5jo7Qrce8u1UfNEh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzgmtSHpBxIqNmxb0x4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"}
]
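For context, here is a minimal sketch of how a raw response like the one above could be parsed and indexed by comment ID. The dimension names come from the JSON keys shown; the allowed value sets are only inferred from the values visible in this sample and in the Coding Result table, so the real coding scheme is almost certainly broader, and the function `index_codings` is illustrative rather than the tool's actual code.

```python
import json

# Allowed values per dimension, INFERRED from the sample response above;
# the real codebook likely contains more categories than are visible here.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed",
                "resignation", "unclear"},
}

def index_codings(raw: str) -> dict:
    """Parse the raw JSON array and index rows by comment ID,
    flagging any value outside the known categories."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded

# One row taken verbatim from the response above.
raw = ('[{"id":"ytc_UgyaIeWeiOUcfaz15C14AaABAg","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
coded = index_codings(raw)
print(coded["ytc_UgyaIeWeiOUcfaz15C14AaABAg"]["reasoning"])  # deontological
```

Indexing by the `id` field is what makes the "look up by comment ID" view above cheap: one dictionary access per inspection instead of a scan over every coded row.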