Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The AI was talking about cleaning not practising medicine, the user just decided…" (`ytr_UgyxeheZr…`)
- "I'm a lazy man, and you couldn't _pay me_ to use generative theft. Because 'AI a…" (`ytc_UgyI0r3_6…`)
- "Moore's law is irrelevant if you only look at it as counting transistors. The th…" (`ytr_UgxakbXid…`)
- "You managed to address a hidden fear I've had about AI that i havent really seen…" (`ytc_UgxgymZqx…`)
- "Everything is dangerous. People, animals, objects, nature ,space, ideas, machine…" (`ytc_UgxvnVnd6…`)
- "Musk isn't concerned about AI safety. He's a salesman for the NWO Digital ID AI …" (`ytc_Ugw_i_Y5J…`)
- "Outsourcing speed up due to more people wanting remote jobs, if your can be done…" (`ytc_UgxoacNSX…`)
- "He is not addressing any of the concerns. Just talks around them. I don't think …" (`ytc_UgwJWmqNx…`)
Comment
Perhaps it's too much to ask for companies to hire multiple philosophers with varying approaches, but I think your worries are well-founded. In essence, the person or people who hired a single philosopher of a single school of thought already made the most difficult philosophical choice in the shaping of AI. I wonder if they hired any philosophers to help them make that decision. 😅
Source: youtube · AI Moral Status · 2026-04-23T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyapTP4_hx4QcrYjd54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwGsb_77uPcxYStb1l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOmT2KHic54-oGmjd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGzaFeAI9tAwCKFr54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwDfaFGCOu069Edaa94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGOOI7YvKnR7ZZ0o54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxSzRHlF0fKDiiytyt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMv9wsTXn-rjPApiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzewOqwiGgp2Sp3P1h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugw6lEMQc-E2j5OVFwN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
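A response in this shape can be parsed and indexed by comment ID, which is what the "Look up by comment ID" view does. The sketch below is a minimal, hypothetical example, not the tool's actual code; the allowed label sets are inferred only from the values visible in the sample response above and may be incomplete.

```python
import json

# Label sets per coding dimension, inferred from the sample response above.
# The real coding scheme may define additional labels.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "user", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "indifference", "fear", "mixed", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    rejecting records whose dimension values fall outside ALLOWED."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

raw = ('[{"id":"ytc_UgyGOOI7YvKnR7ZZ0o54AaABAg","responsibility":"company",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"mixed"}]')
codings = index_codings(raw)
print(codings["ytc_UgyGOOI7YvKnR7ZZ0o54AaABAg"]["policy"])  # regulate
```

Validating against a fixed label set at parse time catches the common failure mode where the model invents a value outside the coding scheme, so bad codings fail loudly rather than silently entering the dataset.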