Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- How do you expect AI to not turn out to be dark and dangerous when the humans ar… (ytc_UgwsZSz76…)
- One flaw...it says it will create a bottleneck at higher employment levels...but… (ytc_UgxMP7BiG…)
- I talk to my copilot like it's my friend, in return copilot helps me out a lot.… (ytc_UgwTwFyFb…)
- If you people are scared of AI, then have a EMP always in your pocket 🖕😉… (ytc_UgzcXq0PS…)
- "Well thats the interesting part! with fewer jobs for people, we will need fewer… (ytc_UgyFQnnkp…)
- please hurry and replace humans with robots and self driving...im tired of these… (ytc_Ugxvrqwbj…)
- This crowd is delusional. We have bigger problems to address and youre obsessed … (ytc_Ugxy3Weqt…)
- AI Engineer here, research shows that being mean to an AI improves its performan… (ytc_UgwexOlLM…)
Comment
Bernie comes across as a robot hater! 😀 Of course he's right about where we're heading. But yeah, eventually nobody will be wasting their lives in factories and engaged in menial tasks. Eventually we could have functional communities again, provided we're all given enough money to live comfortably. That could be a good thing! I mean, come on, Bernie, you're talking as though having people work day-in day-out in factories, driving trucks, and fast food joints is a good thing. It's not. The world can be better than that. I appreciate your vision for reducing the work week soon though, as robots could ultimately mean we're freed up to hang out in villages (or a modern version of human community) again.
youtube · AI Jobs · 2025-10-10T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy4T8fncxAdSYQiJQR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw8wHV4OUXSFh8oCnB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugx_pr_w4IRmGMxKW9p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgzqUTXBU5rpNOMQIrV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzyvdQmAZ5luf-A-p54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugz-Md_J30_vG5j0czZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwNPx5TccoL6ETlpwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwsp9sVbfDrPK0U-0l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwxYea23tesQT0Torp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx2P0Afw6WWaJ2N8JN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
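The raw response is a JSON array with one record per comment, keyed by comment ID. A minimal sketch of the "look up by comment ID" step, assuming only the field names shown above (the data here is a two-record excerpt of that array, copied verbatim; the full sets of allowed values per dimension are not specified in this view):

```python
import json

# Two-record excerpt of the raw LLM response shown above.
raw = """[
  {"id": "ytc_Ugy4T8fncxAdSYQiJQR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz-Md_J30_vG5j0czZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]"""

# Parse once and index the records by comment ID for O(1) lookup.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if absent."""
    return by_id.get(comment_id)

coded = lookup("ytc_Ugz-Md_J30_vG5j0czZ4AaABAg")
print(coded["emotion"])  # -> outrage
```

The same index supports the random-sample view: any subset of IDs can be resolved back to their full coded records without rescanning the array.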