Raw LLM Responses
Inspect the exact model output for any coded comment; look up a comment by its ID.
Random samples

- "You can't control a programmed robot to kill on a given programmed order verifie…" (ytc_UgwPCk05Y…)
- "Ai will never have any sense of self, since it doesn’t know that at some point i…" (ytc_Ugzk_tMRN…)
- "Doctors who work directly with patients will be safe for a very long time. Thi…" (rdc_fcs7fbh)
- "Horses haven't died out just because cars exist. But they aren't the primary met…" (ytc_Ugyj0Nx6p…)
- "LLMs like ChatGPT aren't even *capable* of consciousness. It's just the autosugg…" (ytc_UgwyvmB3l…)
- "Whether one likes AI or not really it is not human. To take one's life obviously…" (ytc_UgyXNjYbj…)
- "Musk answer is the proof that all this tech giant don't think about long-term co…" (ytc_Ugy5A_uLG…)
- "The biggest thing holding AI back from actually being dangerous and acting on it…" (ytc_UgxX6AwjI…)
Comment

> Actually, I'd say you really are a luddite. Per Wikipedia: "The Luddites were members of a 19th-century movement of English textile workers who opposed the use of certain types of automated machinery due to concerns relating to worker pay and output quality." They smashed machines that were brought in by the wealthy to replace the workers to increase profits at the cost of starving ex-employees with nowhere to go. They would have hated AI art.

youtube · Viral AI Reaction · 2025-04-03T11:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNPGDnar0Hhhupgu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw4EC3Vil1xZeLFkzV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwWPBqxE5-WUiOacT94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxaY3-4bBw-DiV856N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwLEIUadG6s_V0hNjN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxYw7ODjhUuVbJN1_J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx_U3UMcRkA3Ed5-hh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugze7L3z40jYk6BLJc14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzWvOAhhTwml_I0UzV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwiM4ms0oncQHNu6RB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"approval"}
]
```
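Since the raw response is a JSON array with one record per comment, looking up a coded comment by its ID is a single parse-and-index step. A minimal sketch (the function name `index_by_comment_id` and the two-record sample are illustrative; the field names and values are taken from the response above):

```python
import json

# Abbreviated raw LLM response, in the same shape as the array above.
raw_response = """[
{"id":"ytc_UgwWPBqxE5-WUiOacT94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxaY3-4bBw-DiV856N4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]"""

def index_by_comment_id(response_text):
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    # Drop the "id" field from each record; it becomes the dictionary key.
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgwWPBqxE5-WUiOacT94AaABAg"]["policy"])  # → regulate
```

This keyed layout is what makes the "look up by comment ID" view above a constant-time dictionary access rather than a scan over the array.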