Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
They say fuck ai but what they don’t know that they just used ai art for REFEREN…
ytc_UgwP1VlZ2…
According to Detroit: Become Human, yes they need rights, or else they'll feel i…
ytc_Ugx7qorsY…
Students use AI to help them through school but then AI gets good enough to star…
ytc_UgxAFCsU8…
Aurora and the trucking companies should pay a percentage of profits to each tru…
ytc_Ugwa-7Ybp…
That bit about analogies and creativity and compression at the end really was in…
ytc_UgwYhtnTZ…
you do know ai steals others art? which means that no, this cant anyhow be bette…
ytr_UgwKXdtdZ…
Autopilot and FSD Beta are completely separate software suites. Autopilot has ab…
ytr_Ugzq6-D0m…
Like nuclear weapons, AI is too powerful and too dangerous to be privately owned…
ytc_Ugxn3nciu…
Comment
I guess you scientist will never learn! You’ve already had to shut AI down because they started talking in a language that you couldn’t understand to one another it’s just a matter of time before they start writing their own codes, and there is no turn off switch
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Harm Incident |
| Posted | 2023-12-07T23:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgwBqhAOev1yLcrWN2h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgznfjTyX8nCeINFtYZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxNBSZCjp02YL42lnx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxYtd7_g0WMuDpUMXR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyPVGgsK8rsq1LgqDR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwNNgtBmDIBasxz4jN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzoqCHGl2UyKFJAP8h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz9YGx6JPoOi0xVh3V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz11gwvTsT05XQ6Fcl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0Q-eeMxCnGiFRB454AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]
```
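A response like the one above can be parsed and validated before the codes are stored. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed category sets are assumed from the values visible in these examples, and the real codebook may define more.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# example records on this page; the real codebook may include more labels.
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # skip records with no comment ID to join back on
        # Every dimension must be present and hold an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwBqhAOev1yLcrWN2h4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"ban","emotion":"fear"}]')
print(parse_coding_response(raw)[0]["policy"])  # prints "ban"
```

Validating against a fixed schema like this also catches the common failure mode where the model invents an off-codebook label; such records are dropped rather than silently stored.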