Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Is the idea of driverless trucks to deny a hard working driver a job to provide …" (`ytc_UgyC-jb03…`)
- "Well now this is just a simple problem of contradictory data. If we want the AI …" (`ytc_UgyQWgOk5…`)
- "the Irony is the best brains and the most powerful leaders in the world are work…" (`ytc_UgynbrvqS…`)
- "In each and every case in which a law enforcement or any organization arrests or…" (`ytc_UgzFm_W3A…`)
- "@jjer125 Yeah, it could be superior and plant-like (minimal conscious-like state…" (`ytr_UgzUrlFSr…`)
- "Work what jobs? We can't all be fucking coders or Uber drivers. And there isn't …" (`ytc_Ugy79o_Sg…`)
- "This is really just playing into the fearmongering the companies want. In realit…" (`ytc_Ugx4SwdH8…`)
- "A battle robot deciding to run away does not experience the emotion of fear. It …" (`ytc_UgxrHwQnq…`)
Comment

> What was the name of that study saying CEOs and managers are more replaceable by ai than anyone else?

youtube · Viral AI Reaction · 2024-12-20T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOVJwiCMhzEbgNGVB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxI5JnldBZetCVrPmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwX6uaTpB68cKlMLtV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxVKIidV8C0vpoE8Lp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxFsOiBqWELEjipd6p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy6i6dWqfybpYPwBMp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxf_fipjNqiox_thKh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzI4MeMujss7XX_0-x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzvxTlbwA4jUfFkcKh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzPqX9svZOqeHgTtWl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"disapproval"}
]
```
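A raw response like the one above is only useful downstream if every record carries a known label for each dimension. The sketch below shows one way such a response might be validated before loading; it is a minimal example, not the tool's actual code. The allowed label sets are inferred from the sample output on this page (the real codebook may define additional labels), and `validate_codes` is a hypothetical helper name.

```python
import json

# Label sets per coding dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may include labels not seen in this sample.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "disapproval",
                "indifference", "resignation", "mixed", "unclear"},
}


def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        # IDs in the sample start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records


raw = (
    '[{"id":"ytc_UgyOVJwiCMhzEbgNGVB4AaABAg",'
    '"responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"}]'
)
codes = validate_codes(raw)
print(len(codes))  # 1
```

Rejecting a whole batch on one bad label is deliberate here: a single out-of-codebook value usually means the model drifted from the prompt, so the batch is worth re-running rather than silently dropping records.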