Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- If the source of the material is completely legal, it's absolutely ethical. But… (ytr_Ugwvuwd-D…)
- This is going to become more and more prevalent in a "follow the follower" socie… (ytc_UgzpO14DQ…)
- 😢😢😢All The Owners Will B TOAST RIP Y'ALL 😢😢FEMALE & ROBOT NOT THE BEST COMBO😢MAN… (ytc_Ugw53okzp…)
- Until i have a robot at my door..with a moving red eye will i take any notice,..… (ytc_UgziHaqM4…)
- Of course AI could do that when it’s only human interpretation that determines w… (ytc_UgxRtPIrv…)
- Tesla and Amazon they need to go. The dependency from ppl who consumed their ser… (ytc_UgzQ6uooa…)
- Fuck it as a software developer I can say u it’s really hard then what you just … (ytc_UgzIc1eFL…)
- Such a thoughtfull man, Geoffrey Hinton. 💜👏 AI needs regulation. Capitalism need… (ytc_Ugw76hQdC…)
Comment
> i think the self driving feature should be used as a safety feature wear a human truck driver drives the truck for a while until the driver flips switch that lets the truck drive it,s self while driver sleeps in their sleeper cab until their ready to start driving truck again i think that would let folks keep their jobs while giving a little bit more of a safety net for when a driver needs to take a rest
youtube · AI Jobs · 2025-11-21T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyFgvY9Qj4Qi2nsSeB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzw7qwT_tC0yaavZ6t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxIY9mmkM5Ft8R2L394AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxpxRg070KO2YCp6rd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyh44b6MNc-2LlOx1x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgythmHMSjR7XkwFJg14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygA1qscTIxMgwrKhh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwIgz_NeMLWoJ95HnZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzbAflLW5PD7YCMdSV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwt3dKkP_itbHhIubp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
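The raw response is a JSON array with one record per comment, carrying the same four dimensions shown in the coding-result table. As a minimal sketch of how such a batch could be parsed and sanity-checked, assuming the allowed value sets are only those observed in this batch (not an authoritative codebook), and that comment IDs begin with `ytc_` or `ytr_`:

```python
import json

# Allowed values per dimension -- assumed from the values observed in this
# batch, not an authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"approval", "mixed", "outrage", "indifference", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check each record's coded dimensions."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dump start with "ytc_" (or "ytr_" for replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgyFgvY9Qj4Qi2nsSeB4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
batch = validate_batch(raw)
print(len(batch), batch[0]["emotion"])
```

A check like this catches the common failure mode of structured LLM output, where the model invents a value outside the codebook, before the codes reach the database.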