Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Hollywood is gonna use AI as much as it can. Actors and writers can refuse allow… (ytc_UgwWMBFNn…)
- What does the future hold for developing countries in the age of rapidly advanci… (ytc_UgzQCi8bU…)
- This guy has entered the panderverse on a whole new level blaming AI now. What n… (ytc_UgxzKG2IA…)
- I agree with you / I dont like this AI fiasco either / But MAYBE if you weren't so a… (ytc_UgwduLh9U…)
- As Bill Clinton said, "A race to the bottom." The trend is newer and newer tech… (ytc_Ugxsvd-OF…)
- AI is just like fire when pre-humans discovered and experienced it, it became a … (ytc_UgzG_YqBD…)
- Last year we caught our then 12 year old using a similar app, where she chatted … (ytc_Ugx9su-r_…)
- Literally one of the biggest reasons why AI is so annoying is because you can ju… (ytc_UgyeC9Bbm…)
Comment
We are soon going to Lear what happens when you have a purely secular society. AI is always being created as atheist and we presume to try and impart some sort of objective moral code (think prime directive or the “three rules”). The answer is give AI a Bible as their starting point and then test to see how a “Christian” AI behaves. I’m sure if someone would do this test they would find out that solves the problem and P(doom) would drop to 0.
youtube · AI Harm Incident · 2025-07-24T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
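A coding like the one above assigns each comment one value per dimension. A minimal sketch of how such a record can be represented and sanity-checked follows; the category sets are inferred only from the codings visible on this page, so they are assumptions rather than the tool's full codebook, and the `Coding` class name is illustrative.

```python
from dataclasses import dataclass

# Category sets inferred from codings shown on this page (assumption:
# the real codebook may define additional values).
RESPONSIBILITY = {"developer", "distributed", "ai_itself", "unclear"}
REASONING = {"deontological", "consequentialist", "virtue", "mixed", "unclear"}
POLICY = {"unclear", "none", "industry_self"}
EMOTION = {"fear", "outrage", "resignation", "approval", "mixed"}

@dataclass
class Coding:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # A coding is valid only if every dimension uses a known category.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

# The coding shown in the table above.
coding = Coding("ytc_Ugz-zgpZlUxWqewzUPB4AaABAg",
                "developer", "deontological", "unclear", "fear")
```

A check like `coding.is_valid()` catches model outputs that drift outside the expected categories before they reach the results table.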
Raw LLM Response
```json
[
{"id":"ytc_UgyrqXcrK0Bl3jzhk014AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzOiGrPFnUIcxTwgAh4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwz5Mcx4b4WwYTGfk54AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz-zgpZlUxWqewzUPB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxNDshZ0sZvQPoEmU54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzztps1oWUMNEpqEaF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwHyQhrqSE760-4KyZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyudRwktUnvc6dflaF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyMvCIOCaYc92Cl2nV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy5n9zhFvQe2xnipOB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
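Because the raw LLM response is a JSON array keyed by comment ID, looking up the coding for a single comment is a straightforward parse-and-scan. A minimal sketch (the `lookup_coding` helper name is illustrative, and `raw_response` here holds just two entries copied from the response above):

```python
import json

def lookup_coding(raw_response: str, comment_id: str):
    """Parse a raw LLM response (a JSON array of per-comment codings)
    and return the entry whose "id" matches comment_id, or None."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            return entry
    return None

# Two entries copied from the raw response shown above.
raw_response = '''[
  {"id":"ytc_UgyrqXcrK0Bl3jzhk014AaABAg","responsibility":"developer",
   "reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz-zgpZlUxWqewzUPB4AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"unclear","emotion":"fear"}
]'''

match = lookup_coding(raw_response, "ytc_Ugz-zgpZlUxWqewzUPB4AaABAg")
```

Returning `None` for an unknown ID makes it easy to flag comments the model skipped in its batch response.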