Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews, with each comment's ID in parentheses):

- "Honestly… the driving of manned trucks coupled with the lack of a driver union,…" (`ytc_UgxCqK8TE…`)
- "I'm not sure how this guy was able to have a 2 hour conversation with bing when …" (`ytc_UgynQBPXR…`)
- "It wont replace software engineers. You still need people to understand what is …" (`ytc_Ugyn3uFxq…`)
- "The ai blackmaled a guy doing wrong would it have done it if he wasnt having an …" (`ytc_Ugz0a-uFw…`)
- "I don't think there is any slowing down at this point, if one company decides to…" (`rdc_jps5ulo`)
- "Ai images are ugly af... Atleast drawing gives you free will unlike Ai that says…" (`ytc_UgzXJW6hk…`)
- "Q: Big tech companies are controlling the development of these AI: A: "There are c…" (`ytc_UgyCo64s1…`)
- "@Objectshows-123 omg replied to the wrong comment, whoops!! Mean to reply that t…" (`ytr_Ugy-AAo8L…`)
Comment
This is so scary to listen too. I don't really know what to do with this information.
Every suggestion for how to handle the situation he suggests, requires good, benevolent leaders that think about the good of the many rather than the few. What we are seeing in the world right now is ... not that.
The USA AI Action Plan is a scary example on how to take the opposite route to what he is talking about.
Source: youtube · AI Jobs · 2025-07-29T08:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
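Each coded comment carries the four dimensions shown in the table above. A minimal Python sketch of validating a coded row against the codebook; the allowed-value sets are only those observed in this page's output and may be incomplete, and `validate` is an illustrative helper, not part of the tool:

```python
# Allowed values per dimension, as observed in this page's coded output.
# Assumption: the real codebook may define additional values.
SCHEMA = {
    "responsibility": {"company", "government", "none", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "approval", "indifference", "mixed", "outrage"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value falls outside the schema."""
    return [dim for dim, allowed in SCHEMA.items() if row.get(dim) not in allowed]

# The coded comment from the table above passes cleanly.
coded_row = {
    "responsibility": "government",
    "reasoning": "deontological",
    "policy": "regulate",
    "emotion": "fear",
}
print(validate(coded_row))  # []
```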
Raw LLM Response
```json
[
  {"id":"ytc_Ugwt5J72ACKrjE9iwKR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxNIPOOrmGLftN2B714AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwrwivAfceMfmg6vu94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxabmMeV-GZZeLDZwR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxlZm-zD5wTn-HvtR54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxETDpYCeradvfhDMB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzF9yh_RRJNxqLfvY54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwG7kGHvtz0_-9I7qB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzE1a__Mm9L2RF8hep4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwlYLbiMZMvfbxA_694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
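The look-up-by-comment-ID flow described at the top of this page can be sketched in a few lines: parse the model's raw JSON array and index the rows by `id`. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response above; the variable names are illustrative, and the snippet uses one row from that response as sample data:

```python
import json

# Raw model response: a JSON array of coded comments (one row shown here
# as sample data, taken from the response above).
raw_response = '''[
  {"id": "ytc_UgxETDpYCeradvfhDMB4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

# Index the coded rows by comment ID for constant-time look-up.
coded = {row["id"]: row for row in json.loads(raw_response)}

row = coded["ytc_UgxETDpYCeradvfhDMB4AaABAg"]
print(row["policy"], row["emotion"])  # regulate fear
```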