Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Let's see AI replace fixing a 25.5 25 loader tire in a quarry, my job is safe...… (ytc_Ugy2MqM1H…)
- It seems like God regretted creating humans and decided to make a new version th… (ytc_UgxcTV-1D…)
- I think he's saying that our lack of compassion will be passed on to our digital… (ytc_Ugwg8CdRx…)
- We don't care about the companies. They're in it for the money. Art is already … (ytr_UgziyVqwU…)
- This really helped to clarify some things. I’m honestly not all that concerned a… (ytc_UgzmJmmA_…)
- Ai programs, image/movie/audio access to people to pacify them, thinking they ar… (ytc_UgzltZhlB…)
- Bruh using ai is stupid bro ai not even very good yet but they still used it una… (ytc_UgzXENB1v…)
- This time is totally different from the past. Reason being AI and autonomous rob… (ytc_UgzWhCaEl…)
Comment
You’d think it’s a logical question, but apparently even the “wise” are rushing ahead like kids in a candy store without thinking things through. It reminds me of the Y2K panic—all that global hysteria over a date change. Why are humans so easily swept up in excitement? Why do we crave controversy and insist on complicating what’s straightforward?
This talk about AI taking over jobs is as feckless as a child trying to drive a car. It’s not going to happen. Think about it: Who builds AI? Humans. So why would we build something to replace ourselves? Who’s in charge? We are—or at least, we should be.
The moment we abandon our responsibility is the day robots could take over. But that’s not a tech problem—it’s a human one. Stay in control, stay accountable, and the machines stay in their lane.
youtube · AI Jobs · 2026-01-16T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxU3CqSuDl3ixrfdAB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgybrV1i5ySrl7fBR8R4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmhnThEo7ZG9QnK754AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy-4K3f3mUbmRqmTVN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw69xkZ8vbJfd6xR1V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsRbR9Zyk2TvZv3Zt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugyf_jOKliRQxXgv1lV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUNTr_vxDF1I0wgf54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugyo5rfhbPmlMjlxOGN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy2zeV7ERXGRqy7CGd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```