Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgyMil4n-…: They say "up skilling" but at the same time they are eliminating those jobs with…
- ytc_UgxK9SFjK…: Nightmare scenario #1 - AI and robots taking over without their knowledge. We’re…
- rdc_m9gid3o: "We stole the entire internet to train our AI, how \*dare\* you let a Chinese co…
- ytr_Ugz2C7-81…: You know that coding is closer to being replaced by AI then art making is, right…
- ytc_Ugxg8K9tD…: Sounds like all the same excuses QA engineers made when test automation took ove…
- ytr_UgzoDa_6U…: Thank you for sharing your thoughts. It's interesting to consider the diversity …
- ytc_Ugy3Yfs5I…: In 100 years Karen's will be able to find a reason why to argue with a robot cas…
- ytr_Ugwg0r_fA…: @petrolcommisionsoftwitwo222.4 Creativity is the ability to create something new…
Comment
This is my idea of dangerous AI: A robot ala the "I Robot" schematic is not even "evil" but malfunctions ever so slightly. Accidentally, it hits and knocks over a baby carriage and then walks over the child. The same goes for those driverless cars. If you value human life do not put it at risk for "fun", convenience, or the next new "thing".
youtube · AI Governance · 2026-01-12T16:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgxmiU6lqG8uBYsIZWh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwKXrDbVOH5TYs0gYl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz0T_GonwaZ8l5LYUB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugyk91XFkUkej2F1lB14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugy9V9c_Tm_BWfXq-CZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwp_RCj5imMLEEgmlR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxEyO3YEeGfe13fR_F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgyZJ2cGxCHqETHA5J94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwo16b_x29d3GIdNpp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwq-qqnHNPibVZEpjJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
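The raw response above is a JSON array with one object per coded comment. A minimal sketch of parsing and validating such a response, assuming the allowed values per dimension are exactly those visible on this page (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the codes shown on this page
# (assumption: the actual codebook may include additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw coding response into {comment_id: codes}, rejecting unknown values."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example using one object from the response above.
raw = ('[{"id":"ytc_Ugy9V9c_Tm_BWfXq-CZ4AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
codes = parse_llm_response(raw)
print(codes["ytc_Ugy9V9c_Tm_BWfXq-CZ4AaABAg"]["emotion"])  # indifference
```

Keying the result by comment ID makes the "Look up by comment ID" step above a plain dictionary lookup.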