Raw LLM Responses
Inspect the exact model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
A human puts substance in its mouth in order to generate the energy it needs to …
ytc_UgwngKVbg…
holy christ.. Idiocracy wasn't meant to be AN INSTRUCTION MANUAL, people..
…
ytc_UgxO6gbIE…
What Insurance Company will want to accept the liability for a 'driverless' big …
ytc_UgwymEr9S…
I've been chasing that "people will like my art if I do x instead of the y I pre…
ytc_UgxaN0rak…
@paksappie Adapting sounds like a very weak mindset, good only for those who see…
ytr_UgwnwXPun…
This is one of my biggest concerns with AI, it's just gonna make everything wors…
ytc_UgwQgzRSR…
Ai? All i see is Indians even in this damn video colonizing everything and then …
ytc_Ugx5Ht15m…
I have to agree. This is another bs way for companies to screw over ppl. While …
ytc_UgzBMm5Gv…
Comment
i would not feel safe with any driverless truck anywhere in north america ... I've seen too many videos about driverless cars that cause problems and deaths ... emergency assistance is one thing ... cruise control okay ... replacing human conscious with no human conscious is like Jurassic park ... what could go wrong ? the problem is the idiots don't see a problem with it ... here is a good video about what happens with driverless cars https://youtu.be/040ejWnFkj0
youtube
AI Jobs
2025-07-27T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzLkWK22Po5lbkmEKl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx5ieB3Y3hLjHxAmyp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzCQSlS9Ar9hWslHn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz5mp0MGEEVF0Lhl-14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx6zCc5LRA6nMbvY4Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgztgT_3Bo81Bt10LYF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugydasvz5a3WSAthJfV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwco83afL79xp-K51V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz5a92tLyAdFmRjoIB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwG8PiiVK2Wm1MOnQB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
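A raw response like the one above is a JSON array of per-comment codings across the four dimensions in the result table. Below is a minimal sketch of how such a response could be parsed and validated; the allowed-value sets are inferred only from the outputs visible on this page, not from the full codebook, so they are assumptions.

```python
import json

# Allowed values per dimension, inferred from the visible outputs above.
# The actual coding scheme may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "developer", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, rejecting unknown labels."""
    codings = {}
    for rec in json.loads(raw):
        coding = {dim: rec[dim] for dim in ALLOWED}
        for dim, value in coding.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        codings[rec["id"]] = coding
    return codings

# Usage with a single hypothetical record (the ID here is illustrative):
raw = ('[{"id":"ytc_EXAMPLE","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(parse_codings(raw)["ytc_EXAMPLE"]["policy"])  # → ban
```

Validating against a fixed label set at parse time is what lets a coding pipeline surface model drift (e.g. a new, unrequested emotion label) immediately rather than at analysis time.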