Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I thought it was ruled that AI works couldn't be copyrighted? Maybe it doesn't …" (rdc_lz5n90o)
- "If you have to make a robot, at least make them look like the Terminator💀💪. I w…" (ytc_UgylJfqiS…)
- "2:36 If we choose to stop AI progress there is a 100% chance we all die. Bad a…" (ytc_Ugwj-89Be…)
- "I want to see a self driving car in a city like New Orleans and see how well it …" (ytc_UgyCfhi0v…)
- "How can you expect consideration of AI welfare when we barely consider HUMAN rig…" (ytc_UgwFf8Gsh…)
- "The real thread is human, believe AI will far better than us developing a peacef…" (ytc_Ugxsj4RIg…)
- "Probably a great podcast but youtube is showing me the same advert every 3 minut…" (ytc_UgxhQGT1z…)
- "What this does tell me is we're fucked with ai. Mow that Trump has approved Grok…" (rdc_ngsp84q)
Comment
I have been a professional software developer since 2000 and get so mad when I read interviews with AI CEOs and CTOs. The Open AI CEO said several years ago that it is a matter of time before the entire planet is covered with data centres. No thanks. Or a recent interview with another such idiot saying people will just be robots made of meat to be controlled by AI. These people are psychopaths.
Source: youtube · AI Moral Status · 2025-06-06T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwKwE-r9CjvJbC8E5d4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxMGQuLtuO-fvzAPYp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwcfFPO1reuKCP54al4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw8aWMi3qEroTuqLWN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxca-nXQo3QXccwo-d4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgysQ7I4GeeKJGhQUJN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwYbrpb6sJRyih6Wvd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5TkJzYnyogPr8QwB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzYgv8IoSG6QImIXt54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyw0C4Gm3dxzKAtul94AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
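A raw response like the one above can be machine-checked before the codes are stored. The sketch below is a minimal validator, assuming the category sets inferred from the coding-result table and the sample records shown here (the exact label vocabularies are an assumption, not a documented schema):

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample records above and are an assumption about the full codebook.
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "mixed", "resignation"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse one raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Hypothetical single-record response in the same shape as above.
raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"virtue",'
       '"policy":"none","emotion":"outrage"}]')
coded = parse_response(raw)
print(coded[0]["emotion"])  # outrage
```

A record with an unexpected label (say, an emotion value the codebook does not define) raises immediately, so malformed model output never reaches the coded dataset silently.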