Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgyOX8e2U… · "Tesla fatalities were caused by human error not the driver in supervised FSD T…"
- ytc_UgyGqT_g-… · "April 2025 half marathon in Beijing a 21 humanoid robots participated. The winni…"
- rdc_nui4aoh · "Obvious answer: Because Trump loves AI, and has done everything possible to prev…"
- ytc_UgzF9wCPC… · "I am not afraid of AI because everyday I am learning tools those are using AI.…"
- ytc_Ugxeb5x-G… · "EU isn’t so far in the AI game not because of regulation but mostly because of h…"
- ytc_UgzVHjQI1… · "This AI shit is getting out of hand lol and some people actually think he fought…"
- ytc_Ugyiw94oN… · "This interviewer is just too much man. But so nice to see the old school AI beli…"
- ytc_UgxgPraKR… · "Why did Tech Ceo Erin Valenti say that she was in a 'thought experiment' 5 days …"
Comment
Remember these “AIs” aren’t really that intelligent yet. What you’re seeing with LLMs are really sort of a trick in that what it is actually doing is just predicting what it should say next, it’s not thinking or arriving at an answer based on what it knows about the question. It just predicts what would be best to say. It’s a little smoke and mirrors when it comes down to it.
youtube · AI Moral Status · 2025-06-28T19:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxC4QC9kSCjTdXS8V14AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyeAgTEPWWntZ0kHmd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYJxBo3-pFQiDapep4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyqoQtGYUtXKY2SZcp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxJMpAwJJpZmiv7TDl4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzyPuMwT8oTWJUE2Up4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMZFJd9STscZl5A3J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFa5Cb_eRLsVMFzod4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugw3F4xtg6A8hgCIe2V4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyIbZ5SvB57cjAN60x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
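The raw response above is a plain JSON array of coded records keyed by comment ID. A minimal sketch of how such a batch might be parsed and validated before use; the allowed values are inferred from the sample output on this page (the full codebook may include more categories), and the helper name `parse_batch` is illustrative, not part of the tool:

```python
import json

# Dimension values observed in the sample above (assumption: the
# real codebook may define additional categories).
ALLOWED = {
    "responsibility": {"company", "developer", "government", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict[str, dict]:
    """Parse a raw batch response and index records by comment ID,
    dropping any record with an unknown dimension value."""
    records = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            records[rec["id"]] = rec
    return records
```

Indexing by ID makes the "look up by comment ID" flow above a constant-time dictionary lookup, and silently skipping malformed records keeps one bad LLM line from failing the whole batch.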