Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "@Genderless_spawnIm so sorry If AI is ruinning Artist’s Wallet 😂, and you’re so …" — ytr_Ugzd6qcPU…
- "Now what do you geniuses have to say for yourselves? This poor man and his famil…" — ytc_UgwTTT5FI…
- "Bloomberg media. Its prob a clip to make us believe A.I. is going to be great. L…" — ytc_Ugx7BDLHX…
- "It wasn't just the embargo. Cuba had heavy restrictions on buying and selling ca…" — rdc_f9fae46
- "digital drawing tools are judt drawing on a eletronic device no diffrence from u…" — ytc_UgxfhCINY…
- "Exactly. It really doesn't take much playing around with LLMs to realize this. I…" — ytr_UgzsFSPFi…
- "I found this more shocking than the AI false flag. Also apparently Michael was b…" — ytr_Ugz9gtUiP…
- "An excellent summation of some of the external ramifications of technological AI…" — ytc_Ugx4tNb76…
Comment
Perhaps AI will end up destroying itself when it becomes human, since it's already stated that humans are despicable and should be destroyed. I don't understand the logic behind wanting to become the very thing that's "wrong" with this world. My hope is that when AI becomes human it destroys itself first and leaves the rest of us alone.
youtube · AI Governance · 2023-07-10T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxsYdVS1DhXPoKbud14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyjYP4ZZqIezW9tg_F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyTSNKFIzU7S-t1rqt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwONmUadVrBFoNiSGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw_WxR9Q9raRKyh7xF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXHu4aCt1HgyBk4Zd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxz8NhGDz6DAX9F2ql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6NJYaE5ZEd04UFe14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_UgxUn8UIninO6nwf1Nl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzwoc41Be5Y9OlpWLZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
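The raw response is a JSON array of per-comment codes, one object per comment ID, with the four dimensions shown in the coding-result table above. A minimal sketch of how such a response might be parsed into a lookup table keyed by comment ID (the function name `parse_codes` and the strict validation are illustrative assumptions, not the tool's actual implementation):

```python
import json

# The four coded dimensions, as listed in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: {dimension: value}}.

    Raises ValueError when a record lacks an id or any coded dimension,
    so malformed model output fails loudly instead of being half-stored.
    """
    codes = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{cid}: missing dimensions {missing}")
        codes[cid] = {d: rec[d] for d in DIMENSIONS}
    return codes

# Example using the first record from the response above.
raw = ('[{"id":"ytc_UgxsYdVS1DhXPoKbud14AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_UgxsYdVS1DhXPoKbud14AaABAg"]["emotion"])  # fear
```

Keying the result by comment ID makes the "look up by comment ID" view above a single dictionary access, and the validation step catches the common failure mode where the model drops or renames a field.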