Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `rdc_o7ojuwr`: The unfortunate possibility is that they might just be doing it because that is …
- `ytc_Ugxq6zPBK…`: Another big problem with me is ai generated photos - makes searching for referen…
- `ytc_UgzrrI9l1…`: Not sure if I caught the point, but AI variations to the art work should have to…
- `ytc_UgzArKV2c…`: Looking back at biblical history (hypothetical), we, as species create AI beings…
- `ytc_UgyHlTD8G…`: Remember the Facebook experiment back in 2017 when A.I was communicating with on…
- `ytc_UgwNDq_ts…`: 1. If ai captures all jobs then who will buy products made by industraliasts. Ho…
- `ytc_UgyWM4G2r…`: TO be fair, if it was made in 2018 it's probably more technically impressive tha…
- `ytc_UgjeHZ3et…`: Why would you want to create a robot to be like humans? Robots would be perfect …
Comment
You don't need some future AGI evil AI scenario for this to be dystopian. Throw some agentic AI on the bank software that processes withdrawals and let someone give it the wrong prompt intentionally or unintentionally or maybe agentic AI on the decision process for whether your insurance claim is accepted or denied. It's already happening. This technology should not be allowed for use by the general public. The problem is it's already out there in the public consciousness and now there's no putting Pandora back in the box.
youtube · AI Moral Status · 2025-12-12T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBAwCRBzS_XapFi5J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwv6JeVslLKcXO_K2R4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxdomAxdbGGbvvrh4Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzaYssCSyX2smPfH0R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwip_VVgxx1MsCNq8h4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_Du3fNSCIdcRzIh94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJSg7QXfpv21ExIhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz8GvEt7Gm0vQIbLsN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgywAWD1gWBab0iqb4V4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzKNAgH33nc0oqO8j54AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```