Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "If there’s one thing we need less in life is these obnoxious Meta staff engineer…" (rdc_oi0g6d9)
- "because its fucking awesome, soon he wont have a job, the ai systems will start …" (ytr_UgxlSj8ut…)
- "These algorithms are only as good as the information that it’s given…,once ur in…" (ytr_Ugx64bzFc…)
- "Why do you want AI to take all the jobs Elon? You want docile slaves, huh?…" (ytc_UgzyYjcMN…)
- "Our county roads were overgrown with grass hi g her than my knees. AI didn't do…" (ytc_Ugz0dBmA_…)
- "As one of Jehovah's Witnesses, I have no issues with Ai art obsoleting tradition…" (ytc_UgyNt3ljS…)
- "Bilateral agreements were common in the 1990s and early 2000s, but less so these…" (rdc_e2wh067)
- "@angamaitesangahyando685 art IS already accessible to everyone without AI, and c…" (ytr_UgyaSEtHq…)
Comment
The closing comments of this video reminded me of just how excited I am for the AI/LLM bubble to burst and for Zachary to do an autopsy on its death. I feel as though, at this point in time, the horrifics of AI use are outnumbering the ACTUALLY applicable and realistic benefits of them, not even considering the sustainability aspects (which mind you, are abysmal, but big companies rarely care for that) but the sheer number of deaths and human repercussions happening are bound to deter people away from AI, enough to cause mass panic if things keep escalating as they are. Also, combine AI encroachment with the increase of surveillance and sketchy data gathering schemes? We're cooked.
Platform: youtube
Timestamp: 2026-02-09T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
  {"id":"ytc_Ugxaud33fd9RwnJmL3h4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQFzVliM1x7Fqc6f94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyrLo5pLZlbBKRl7Cl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzAJtsGTi3K-_R2Vyh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxT2OAJb0qSjKXqq5N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwzUnq-hHFViCZW49p4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz3dYagCJmWWF8hCIN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwmVHnB1CWdWrYbIbF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugx41js48c5Fremh0zp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzPzEK9qE5rfwTcL3R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
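A raw response like the one above is a JSON array of per-comment records, one object per coded comment. Before storing such records, it is worth validating them against the code book, since an LLM can emit malformed objects or out-of-vocabulary labels. Below is a minimal sketch of that check; the allowed value sets are inferred only from the labels visible in this response and the table above, so the actual code book may contain more categories, and `parse_coding_response` is a hypothetical helper name, not part of any tool shown here.

```python
import json

# Label sets observed in the response above (assumption: the real
# code book may define additional categories).
ALLOWED = {
    "responsibility": {"user", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept if it is an object with an "id" and every coding
    dimension carries a label from the observed vocabulary.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records missing the comment identifier
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage: a record with an out-of-vocabulary label is silently dropped.
raw = ('[{"id":"ytc_a","responsibility":"company","reasoning":"mixed",'
      '"policy":"ban","emotion":"outrage"},'
      '{"id":"ytc_b","responsibility":"robot","reasoning":"mixed",'
      '"policy":"ban","emotion":"outrage"}]')
print([r["id"] for r in parse_coding_response(raw)])
```

Filtering rather than raising keeps a long coding run alive when a single record is malformed; the dropped IDs could instead be logged and re-queued for recoding.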