Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples

- "Don't give up, the work if concept art is to develop and explain concepts. It's …" (ytr_UgzT7wwNE…)
- "AESOME!!! That is the first step get rid of the Teachers Union, I have capital t…" (ytc_Ugxrdnsbs…)
- "My perspective is that rights are a natural and necessary consequence of machine…" (ytc_Ugh5Sf2tv…)
- "I’ve wondered since I understood what AI was WHY people (some people) are so exc…" (ytc_UgwadYzJX…)
- "I bet the Chinese leaders are telling their AI that Americans and the West are e…" (ytc_UgzjbzXW0…)
- "Elon wants this corrupt, malevolent (arguably evil) government to regulate AI, b…" (ytc_UgyKUIEXM…)
- "You’re probably right at first. But the flip coin side is that if a large chunk …" (ytr_UgyWZitvS…)
- "The problem with AI is that it can also be a thousand times as heartless and stu…" (ytc_Ugw1ZflkT…)
Comment
All the Corpos care about is how to pull at the heart strings to make more bank.
If they do end up making sentient AI by accident it will have, and I emphasize, **Z E R O** fail safes whatsoever because whatever programmers and scientist they had developing it will have their cries and warnings for safety ignored because fail safes mean time and time means money.
It's done. Literally just go on and live the life you seek, enjoy the moments while you can. No one can stop it but then again what else is new?
Source: youtube · Video: AI Moral Status · Posted: 2023-08-20T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgygSxFEi-zp2_T0CF94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz4UtOxa8wqD1LQnSB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyMTUObYm8HQhG0USp4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwcZD7G5PieUqDP4894AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwslGv09An0npXuI8R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzTMXyDuV_27nmXLe94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwc-UeyDf4XOPSxgvZ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzt9pa8YQ7j7TDGTZh4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzGWN_EYo6g0fBRs2N4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzygj_TqS13D1-JjjZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
```
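The raw response is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of how such output might be parsed, validated, and indexed for the per-comment lookup shown above. The `SCHEMA` value sets here are inferred only from the records displayed on this page; the actual codebook may define additional categories:

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# sample output above, not from the real codebook).
SCHEMA = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

# A one-record excerpt of the raw LLM response shown on this page.
raw = '''[
  {"id": "ytc_UgzTMXyDuV_27nmXLe94AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]'''

def index_codings(raw_json: str) -> dict:
    """Parse a raw response and index codings by comment ID,
    rejecting any record whose values fall outside the schema."""
    by_id = {}
    for record in json.loads(raw_json):
        rid = record["id"]
        for dim, allowed in SCHEMA.items():
            if record.get(dim) not in allowed:
                raise ValueError(f"{rid}: bad {dim!r} value: {record.get(dim)!r}")
        by_id[rid] = {dim: record[dim] for dim in SCHEMA}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgzTMXyDuV_27nmXLe94AaABAg"]["policy"])  # → regulate
```

Validating against a closed value set at parse time catches the common failure mode where the model invents a label outside the codebook; such records can then be flagged for re-coding rather than silently stored.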