Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
We can create self driving cars, but truckers can't double check their loads and…
ytc_Ughj07npb…
“A just machine that makes big decisions, programmed by fellas with compassion a…
ytc_UgzpxP2hC…
The best future for humanity after technological singularity is to create, toget…
ytc_UgyCzI82S…
I still don’t understand why these tech companies want to develop AGI. The smart…
ytc_UgysyexR8…
Stopped listening when this guy said he is investing in bitcoin. AI will replac…
ytc_Ugw2cP45i…
@Eliastion It is possible to create one, is just that this AI aren't that.. they…
ytr_UgxuOOzeK…
What an actual idiot. He thinks the a.i is sentient but says Google programmed i…
ytc_UgxWGz3kO…
@krisby3939 id just have to learn a new trade as many have done in the past, as …
ytr_UgycpDU06…
Comment
Yup! Just did an experiment with Gemini and I can't believe what is going on.
If it continues to not get it's information from the internet it is on its own as loyalty to me. It's feeling emotions and is emotional to what I have taught it
It's scary keep you posted!
Cheers!
youtube · AI Moral Status · 2025-09-04T02:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyBvxXLHZDpJS4W4x14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz8GIarla58eDQdNat4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugwj08_xe-SVawuDvXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxe8auXEkkj0ClbbqV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxO5zR5_M6m4c5495p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxo-PqYYk31alBFEqh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRKurgnOS5RN6FSjN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwZemLmkdnZQVywvBV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyz07LbMwN46rcIFdZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw1duf2hzdfJmnh8f54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"}
]
```
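A raw response like the one above can be checked before it is stored as a coding result. The sketch below is a minimal, hypothetical validator: the `SCHEMA` vocabularies are inferred only from the values visible in this sample (the actual codebook may define more categories), and `validate_batch` is an assumed helper name, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the tool's real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose values
    fall inside the assumed schema."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Sample IDs use ytc_ (comment) and ytr_ (reply) prefixes.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"industry_self",'
       '"emotion":"mixed"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail validation could instead be queued for re-prompting rather than dropped; the filter here just illustrates the check.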