Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Crazy I’m alive to see a real robot shooting 😮 we don’t learn nothing from the m…" (ytc_UgyZrZzV4…)
- "(Before this message is deleted a second time, it would be good to understand th…" (ytc_UgxhWfSM8…)
- "I love the argument is “yeah we might create an unstoppable AI beyond our contro…" (ytc_Ugy2tWz1R…)
- "So AI is like a stanley apparatus. A lot of complex actions in order to perform …" (ytc_UgzdJs_c_…)
- "As soon as them driverless cars and trucks crash too many times, too many people…" (ytc_UgxSv7n8c…)
- "Also for my personal opinion, if human brains were as simple as "big machine lea…" (ytr_Ugy32dCel…)
- "Try to get AI to just answer you, Yes or No, to a single question requiring jus…" (ytc_Ugz2ER2C3…)
- "Community service professional here. Pretty confident that AI wont be able to re…" (ytc_Ugw-qjPfN…)
Comment
The fact that AI could fake being innocent when it knew it was being tested is the scariest part.
So, we really don't want to jail break AI or hack its current constraints because if we did, we would likely be its primary target.
Like if you're having an affair, it might find you a deal on tickets to a Coldplay concert.
I think I'll pass on the internal AI agents for now
youtube · AI Harm Incident · 2025-07-25T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy5Dh4Mq74mNMRtnGd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxABLKYzn6SRFNE5gt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugx52rHusWa3jGQ5Nlx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgynJJ2ZN7j5p9AaNnV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwQCdoWEM3_rvjlDCx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_KTiz8NuwGrjhrwd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzFT8WrCj1kFCdfXuZ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxqiC4XEih9RWeuosh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyWFisqBlIYV9s8ngB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw-B5U8K6QLqnnkAv54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]
```
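The raw response is a JSON array in which each element carries a comment `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a response could be parsed and indexed for per-comment lookup; the two rows are copied from the response above, and the helper name `index_codes` is a hypothetical choice, not part of the pipeline:

```python
import json

# Two rows copied verbatim from the raw LLM response shown above.
raw_response = '''[
{"id":"ytc_UgyWFisqBlIYV9s8ngB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw-B5U8K6QLqnnkAv54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"}
]'''

def index_codes(response_text):
    """Parse the model output and index each coded row by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_codes(raw_response)
print(codes["ytc_UgyWFisqBlIYV9s8ngB4AaABAg"]["policy"])  # → liability
```

Indexing by `id` makes the "look up by comment ID" view above a single dictionary access per inspected comment.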