Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- I really like and admire B Sanders. But he, like so many, contradicts himself. O… (ytc_UgzqjYbqC…)
- It's true that humans and robots operate differently. Humans have emotions and c… (ytr_Ugw_Mwlxl…)
- Folks, it’s already happening where some men prefer female robot companions and … (ytc_UgzVs6jtA…)
- "Since being added to the list he was shot twice" / So you're saying the AI was co… (ytc_Ugx3DSlgJ…)
- This debate represents a fundamental misunderstanding about AI. It doesn't funct… (ytc_UgxrJiMzG…)
- @i34g5jj5ssx I have nothing against new technology. The only thing I have agains… (ytr_UgypcT2i5…)
- I can understand the thought. You can recreate it with the details on screen nex… (ytr_UgyFcMUsZ…)
- I think to sum it up, when a task is more automated, you have physically done le… (ytr_Ugx5R1Lkm…)
Comment
AI lacks human nuance. That’s why I was wary of self-driving cars from the start. And it’s why firing workers who could at least be quality control for AI was always a terrible idea.
youtube · AI Jobs · 2026-02-06T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz7RzIpjDSwokHaQkR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwA74HCccq_ta5mGaF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzF0zxrr4-PcKFz1wV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz3mYKJB7GdC4P2MYd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz7ifSR2KrmdU-T-QV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8Fl6rzxpfS_NgRKl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxwVK918kpUifWvPEN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwrQMNnf8ZkQH7HZid4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwmL9ZZrNqzZeBM1TF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz_Dgq6tAYuHslVXft4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
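A raw batch response like the one above can be turned back into per-comment codes by parsing the JSON array and keeping only rows whose values fall inside the expected vocabulary. The sketch below is a minimal, hypothetical parser: the allowed value sets are inferred only from the samples shown here (the real codebook may define more categories), and the function name is an illustration, not part of the tool.

```python
import json

# Dimension vocabularies inferred from the sample response above.
# Assumption: the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"outrage", "approval", "mixed", "fear",
                "resignation", "indifference"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping any row with an out-of-vocabulary dimension value."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in values for dim, values in ALLOWED.items()):
            coded[row["id"]] = codes
    return coded

# Example with a single (made-up) row:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
print(parse_raw_response(raw)["ytc_example"]["emotion"])  # outrage
```

Rows that fail validation are silently dropped here; in practice one might instead log them for re-coding, since a malformed row usually means the model drifted from the requested schema.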