Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or pick from the random samples below.

- "until proven that those incidents were actually autopilot hitting them then ever…" (ytc_Ugy3SrPcK…)
- "The purchase of goods and services was needed only to incentivize people to perf…" (ytc_Ugxl5ZJa8…)
- "The very huge mistake of this world was AI and the man who created it, very stup…" (ytc_Ugxt6uGuB…)
- "Total BS / If 99 percent are unemployed then who is buying the products and servi…" (ytc_Ugwx_ijuG…)
- "Hi everyone, I understand that the robot bring us several benefits to us and I a…" (ytc_Ugz0kPodQ…)
- "there is actually no need for us making robots look like humans, im sick of this…" (ytc_UgzDbD9Fh…)
- "Thank you for your comment. If you have any questions or need clarification on t…" (ytr_UgwuHBxMZ…)
- "They have a 60-day stoppage, but who informed the AI that there was a stoppage? …" (ytc_UgztrTUUO…)
Comment

> All of this is scary but I feel like the registered nurse replacement might be the scariest thing. I do not WANT an automated hospital. I do not want to have to trust that a computer is going to be able to intuit what is happening to a person's body or what needs to happen with their care. Between this, the failures of automated vehicles, and the loss of employment without a replacement route, this is going to get people killed.

| Source | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Jobs | 2025-10-08T18:1… | 2 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id": "ytc_UgwFMtlUUmfsfS9fDaV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzOwin7lMRnGv6mDJV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugw5ExaYhcFLnIptxw14AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwvYJEkUugncEMIl614AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwjDNHHRXxXYhHqGpR4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgysQVqucCxoOlJ8uj94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugws9b9JwcCCOAnJw6R4AaABAg", "responsibility": "government", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwSF2Lp6Ysb0I0ONqd4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgztU_xm9qcOJJHzJnp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxXdQx6bIulwY--SlN4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]
```
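A batch response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, assuming only the structure visible in the raw response (a JSON array of objects keyed by `id` with four coding dimensions); the `codings` index is illustrative and not part of the tool itself.

```python
import json

# A raw batch response from the coder model, trimmed to two entries
# for brevity. Each element carries the comment ID plus the four
# coding dimensions shown in the Coding Result table.
raw_response = """[
  {"id": "ytc_UgwvYJEkUugncEMIl614AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgwjDNHHRXxXYhHqGpR4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "outrage"}
]"""

# Index the codings by comment ID so the exact model output for any
# coded comment can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgwvYJEkUugncEMIl614AaABAg"]
print(coding["emotion"])  # fear
print(coding["policy"])   # liability
```

Keying on `id` also makes it easy to spot comments the model skipped or coded twice: compare the set of keys against the set of IDs that were sent in the batch.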