Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "This has to be one of the most... irrational videos I've ever seen. You are tryi…" (ytc_UgxvQ8G9x…)
- "Exactly.....just the fact they think M is out for their good, is disturbing....i…" (ytr_Ugyn-eShO…)
- "Holy fu- that engine room part was like gta v 😮. I think Rockstar delayed 1 year…" (ytc_Ugy8jirOi…)
- "It’s called “hallucinating” when AI do this. They are not intentionally lying to…" (ytc_UgyK3z2Fm…)
- "Autonomous IA is frightening. Even if it never leads to a large scale war, I fin…" (ytc_UgyyYfDnF…)
- "Hahaha right, Liberia is a failed state, there's no effective government. Whoeve…" (rdc_ckqccrg)
- "Why should I want a Car that sees like a Human a.k.a. Vision, It's a Robot make …" (ytc_UgylTauvP…)
- "In the end, its way cheaper to automate something than to utilize human labor. H…" (ytr_UgzxTutlR…)
Comment
The cargo comes to us on giant boats with a very small human crew. Much of it is taken by miles and miles of trains with a very small human crew. It is only this part of the process that employs so many people. And humans driving trucks is incredibly dangerous. Most drivers are awesome. But I have seen so many accidents caused by big rigs in my life that I welcome the safety that will come from self driven trucks. I just hope they don't rush it. People need retraining and we need to be sure they can actually cut down on accidents. Most importantly, every company that saves money by moving to automation needs to reinvest in the workers they lay off. They need to pay for retraining and job placement.
youtube · AI Jobs · 2025-05-28T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
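A coded row like the one above can be sanity-checked against the allowed category sets before it is stored. The codebook below is a hypothetical reconstruction, inferred only from the values visible in this page's samples; the real codebook may define more categories per dimension.

```python
# Hypothetical codebook inferred from the sample codes shown on this page.
# The actual annotation scheme may include additional categories.
CODEBOOK = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"approval", "fear", "outrage", "resignation"},
}

def validate(row: dict) -> list:
    """Return the dimension names whose value is missing or not in the codebook."""
    return [dim for dim, allowed in CODEBOOK.items()
            if row.get(dim) not in allowed]

# The coding result shown in the table above passes validation.
coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "none", "emotion": "approval"}
print(validate(coded))  # [] — no invalid dimensions
```

Running the check on every row of a batch response catches malformed LLM output (misspelled or invented categories) before it contaminates the dataset.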
Raw LLM Response
```json
[
  {"id":"ytc_UgxwsMuvmLIF-vLcOtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxWv07Aig6cQn9TL9Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzkE36-2OPGjmHor9d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwG6VMWYrV_AAIzmoJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmMoLvd_bEh9MD6i14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwSQREqhZV5DBqGfft4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwKbB7wRs8fe9BKWjN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgykxKo4O-rXC3pPXDV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyo9yhPf4PIooNoCj94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwOWPdf8W8PPpkj6JJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
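The raw response is a JSON array of per-comment codes, so looking up the coding for one comment amounts to indexing the parsed array by `id`. A minimal sketch, using an abbreviated two-row copy of the response above:

```python
import json

# Abbreviated raw LLM batch response (same shape as the full response above).
raw = """
[
  {"id": "ytc_UgxwsMuvmLIF-vLcOtB4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwSQREqhZV5DBqGfft4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

# Index every coded row by its comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coding for a single comment by ID.
row = codes["ytc_UgxwsMuvmLIF-vLcOtB4AaABAg"]
print(row["reasoning"], row["emotion"])  # consequentialist approval
```

This is the same lookup the inspector performs when resolving a comment ID to its coded dimensions; any comment ID missing from the batch raises a `KeyError`, which is worth surfacing rather than silently skipping.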