Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- I'm afraid it's too late to unplug. AI created Bitcoin to fund humans in creatin… (ytr_Ugxu5nP1v…)
- @AlaKur-k7z I'm confused, if you are a master level cook, then wouldn't that me… (ytr_UgwPtBqp7…)
- They aren't going to stop until it's too late. A.I. will be the end of humanity … (ytc_Ugyow_BaJ…)
- "Lets let the ai have access to the Internet!" "Why is she saying the n word-"… (ytc_UgzGrW0GR…)
- @melferrellyt Yeah.. that's literally not new for any algorithimic thing. Youtu… (ytr_Ugw5S4-bL…)
- Collectivism is more dangerous than AI. More dangerous than collectivism is the… (ytc_UgyuXqMLA…)
- The reason for AI to have it all is monitoring! But... Consider the connection… (ytc_UgySOpZEL…)
- "fck ai! Fck ai" and then they redraw this cool ai generated art... Great pictur… (ytc_Ugwjd6_Pf…)
Comment
I'm iterating on a potential social strategy of getting from "A to Z" of convincing society how to take AI seriously, and how to make everyone bat sht scared of it. It's like a starting point. At some point 'll share it.
I'm not sure why I haven't heard anything too similar to what I'm coming up with talked about yet anywhere, are people just not doing this or anything?...
Doing shows and talking about it great, it's definitely part of the plan... but there should be a lot more strategy going into this if we are serious about saving humanity.
youtube · AI Governance · 2025-08-23T09:1… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
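The coding result above is a per-comment record with four coded dimensions plus a pipeline timestamp. As a minimal sketch of how such a record could be rendered into the table shown, the helper below is hypothetical (its name and signature are not from the source); the field names mirror the raw LLM response, and the "Coded at" value is assumed to be added by the coding pipeline rather than by the model.

```python
# Hypothetical helper: render one coding record as the markdown table
# shown above. Field names mirror the raw LLM response; "coded_at" is
# assumed to come from the pipeline, not the model output.
def render_coding_table(record: dict, coded_at: str) -> str:
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {name} | {value} |" for name, value in rows]
    return "\n".join(lines)

record = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "mixed"}
print(render_coding_table(record, "2026-04-27T06:24:59.937377"))
```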
Raw LLM Response
```json
[
{"id":"ytc_Ugwim9XQC9rU_cnMzhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyO6Ytj4-Ipljm9bO54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzBUp9cqxp-Q-SKku14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz2EiMn64SuvupH3-V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy_vIeMWyPOX-y2BsR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyOOA_hESTcxGbeUjt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1T_34YaiGCD0NUaF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz5YYrTA7lZg1omUoZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzbvp-J4ZvzkrKuSpl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxRl34qJ6wXvrpB-Ax4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```