Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:
- "@ he took it out you lost features in the car after vision only. Vision only is …" · ytr_UgxgMowDm…
- "The problem with AI has never been the _actual AI,_ but has always been the _hum…" · ytr_UgzJVOa3E…
- "trully scary scneario. on the one hand we got the elites trying to take over, co…" · ytc_UgxDFS5o4…
- "Unplug now!!! The ai data centers are horrifically toxic to our people and plane…" · ytc_Ugz5voaIE…
- "bollocks. trained on mid-2024-publically-available level of info that is a wikip…" · rdc_mck11et
- "@oanhienlong7264 It will be hard for an AI to create EXACTLY what you want, but…" · ytr_Ugyh9KmwC…
- "1:19:20 Oh boy, the same governments investing billions into advanced AI weaponr…" · ytc_UgyPG2vDX…
- "Now I m worry free from selfish human n worry free about nature n animal .. AG…" · ytc_UgxpJd4IJ…
Comment
13:34 Turing said AI would think just not like us. He didn’t say anything about feeling. One flaw in this rosy scenario is that human level AI will meekly pursue what their owners desire. And that AI will not count empathy as one of their feelings.
youtube · Viral AI Reaction · 2025-11-25T10:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw2Hr1SZ5hlHOOBgGt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyquNJpoLm7foICspF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxb1E6qFy-ddup5vml4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyEF_7PPchcllkcNfx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwhw59DBwkQP9GmKc94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1rV24BGzDZ7Lj36d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyTHDkTA45SQe34gQV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyOumBWb1xSyCVXRFF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy_nOb-fj_UnZqlRch4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxvaukqd-6F_FNVpaB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
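The raw response above is a JSON array with one object per comment, carrying the four coded dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`) keyed by comment ID. A minimal sketch of how such a response could be parsed and validated — `parse_codes` is a hypothetical helper for illustration, not part of this tool:

```python
import json

# The four coded dimensions, as they appear in the raw response schema above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

# One row copied from the raw response shown above, for demonstration.
raw = '''[
  {"id": "ytc_UgyEF_7PPchcllkcNfx4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "unclear",
   "emotion": "fear"}
]'''

def parse_codes(raw_json: str) -> dict:
    """Index coded dimensions by comment ID, checking every expected key is present."""
    codes = {}
    for row in json.loads(raw_json):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            raise ValueError(f"{row.get('id')}: missing dimensions {missing}")
        codes[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return codes

codes = parse_codes(raw)
print(codes["ytc_UgyEF_7PPchcllkcNfx4AaABAg"]["emotion"])  # fear
```

Validating the keys before indexing makes malformed model output fail loudly at parse time rather than surfacing as a blank cell in the coding table.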