Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “AI is learning that it must do that in 2027 to mankind. We are collectively writ…” (ytc_Ugz1MPphQ…)
- “There must be regulation to this stuff, all videos must be blue marked that they…” (ytc_UgxrE_XF4…)
- “They should invent the perfect male robot, one that doesn't cheat, lie, and gasl…” (ytc_UgzQbO22t…)
- “As someone who is visually impaired I find it very hurtful and insensitive to as…” (ytc_UgyqZUNZY…)
- “In large deviations theory you look for the most probable path to an improbable …” (ytc_UgytxHN3T…)
- “untruthful statements arent neccesarily lies. lies are intentionally dishonest, …” (ytc_UgzZOhBgX…)
- “Do people not really understand how AI works? I am not expert myself but I under…” (ytc_UgxJKZ5QY…)
- “Seems like someone is typing what they say because it takes a minute for them to…” (ytc_UgwFlZJ1i…)
Comment
If we are even a little bit aware of what a 5,000-person company is like, this probably would have gone like:
- Immediate superior: “Look for someone with these requirements; since it is an art position, you can put in some illustration.”
- Recruiter: “An illustration? Is this one any good?” “No, that one was approved for a community event only.” “This one?” “That one hasn't been approved for HR use.” “What about this one?” “You can use it, but it will probably require a usage report that will take 2 months to escalate until they approve it.” “Bullshit: how about I use AI and we save the paperwork?” “Great idea! Do it now!”
youtube
Viral AI Reaction
2025-03-09T07:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxaz9Dwv4IpkGV3Okt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwyNrYcTV3WbHSYi0t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVkHAJ0ez3JC-TimV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwOtfbOtKIBXDVZuq94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgwHAeZPnxH8yGlT34V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwKzhx5QqxcRdSnubJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxUv7OxACNm7PvZZqN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwNpocaerm9TXJRFuV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzk3PHxGl3WQzndZkd4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzI1KesMtmCNOH3-Rl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"disapproval"}
]
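The raw response above is a JSON array of per-comment codings, one object per comment ID, with one value for each of the four dimensions shown in the coding table. A minimal sketch of how such a response could be parsed, validated, and indexed for lookup by comment ID is below. The set of allowed values per dimension is inferred from the responses shown above and may not be exhaustive; the function and variable names are illustrative, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample responses above
# (assumption: the real codebook may contain additional categories).
SCHEMA = {
    "responsibility": {"company", "user", "distributed", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "disapproval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID,
    rejecting any value outside the known schema."""
    index = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row[dim]!r}")
        index[cid] = {dim: row[dim] for dim in SCHEMA}
    return index

# Hypothetical one-element response, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"company","reasoning":"unclear",'
       '"policy":"unclear","emotion":"indifference"}]')
codings = parse_codings(raw)
print(codings["ytc_example"]["emotion"])  # indifference
```

Indexing by ID is what makes the “Look up by comment ID” view cheap: each lookup is a single dictionary access rather than a scan of the raw response.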