Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI doesn't know what the function of a wall is. AI doesn't know what the function of a door is. AI doesn't know what function the fingers on a hand serve. AI knows what things are, but it doesn't know what function anything serves. It doesn't feel guilt when it makes a mistake. It doesn't feel embarrassed when it makes an error. Employers can't dock an AI's pay or enforce consequences on the AI, and the AI wouldn't experience fear of those consequences anyway. The AI doesn't take pride in doing a good job. The AI doesn't feel joy when it is rewarded for doing a good job. You can't reward the AI because it doesn't care.
Source: youtube · AI Jobs · 2026-02-06T22:3… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFcHkdSebYhF_WQv14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxFw-NtIM_NIduOYoF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx1wZYu0iIfLEniDUp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz8q2zLuQyVGlDQQVR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzlUqNNMkmOduuuXkx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz1-DKCPIT7aAsvuUZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy16uVb89g-XpDbWqd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfG0pUzDx4IjzVMVd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzazIJ6OILKlnemobF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx-1T1_US7J2i9ZwKl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
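The raw response is a JSON array of per-comment codes, one object per comment ID, with one categorical label per dimension. A minimal sketch of how such a batch response might be parsed and matched back to a specific comment ID — the `parse_batch` helper is hypothetical, and the allowed-value sets are inferred only from the labels visible above (the real coding scheme may define more):

```python
import json

# Assumed label sets per dimension, inferred from the codes shown above.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response and index the codes by comment ID.

    Raises ValueError if any dimension carries a label outside ALLOWED,
    so malformed model output is caught before it reaches the database.
    """
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single-element batch (hypothetical ID):
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"ai_itself",'
       '"reasoning":"deontological","policy":"none","emotion":"indifference"}]')
codes = parse_batch(raw)
print(codes["ytc_EXAMPLE"]["responsibility"])  # ai_itself
```

Validating against a fixed label set at parse time means a hallucinated or misspelled code fails loudly per batch, rather than silently skewing the coded results.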