Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The current AI models are still dumb as shit. I regularly have to correct them w…" (`rdc_nt6p90k`)
- "I think as long as you’re up front about your AI content (mentioning it’s AI on …" (`ytc_UgwCIp7ro…`)
- "Engineers use tools to make designs. The tools will not be able to fully replace…" (`ytr_Ugz1TDCrM…`)
- "Everytime we ask AI what it would do or how it feels about humans we are teachin…" (`ytc_Ugym3ljHt…`)
- "the fact that we aren't zealously unionizing every workplace worries me. when th…" (`ytc_UgwSDMatp…`)
- "Also, this video sounds like the writing was produced by ai and it sounds bland …" (`ytr_UgzrbqzZB…`)
- "THIS TECHNOLOGY WILL BE HELPFUL FOR WIDOWS AND WIDOWERS .. TO GET A BIT PEA…" (`ytc_UgxWjZuRk…`)
- "ai detectors dont work and really should not be used. they cause stuff like this…" (`ytc_UgzGSP96v…`)
Comment
This all presumes conflict and competition. So the solution is to collaborate and share resources fairly.
The problem is the assumption of greed, and the malice to use force to maintain that unfair advantage.
If AI does these things we're scared of, I'm not sure it would be much worse than us.
youtube · AI Moral Status · 2025-04-27T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyVfyBXivp_9In7R_R4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgxTmpgAJvviQzTWkEh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwAvhYLl58_6_j_tgZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSWH9eMEXUiOAYGyZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxjv77kadE4spSnrUN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxzM-udII6vxPT-2Sh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx_FM5npWBcdSLzgBt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyW_vlv-Pq9J1UtdjB4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwBUyJuw6xVBuxfu0p4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx2jtYJkgYYYhG-k_54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
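A raw response like the one above has to be parsed and checked before its rows can be trusted as codings. The sketch below shows one way to do that, as a minimal assumption-laden example: the `ALLOWED` vocabularies are inferred only from the values visible on this page (the real codebook may define more categories), and `validate_rows` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Allowed values per dimension, inferred from the outputs visible above.
# ASSUMPTION: the actual codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "none", "developer", "company", "distributed"},
    "reasoning": {"virtue", "unclear", "consequentialist", "mixed", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "mixed", "indifference", "outrage", "resignation"},
}

def validate_rows(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that have an id
    and an in-vocabulary value for every coding dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # drop rows the model emitted without a comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"}]'
print(len(validate_rows(raw)))  # 1
```

Rejected rows can be logged by ID and re-queued for coding rather than silently coerced, which keeps the "Coded at" provenance in the table above meaningful.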