Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "The problem with 'safety researchers' is that they're all decels who would rathe…" (rdc_m9i2w8p)
- "Me:you generated an a.i art just cuz your way too embarrassed to show your own a…" (ytc_UgyaRte7V…)
- "I do used ai art but just for fun. I don't make money with it because I know it'…" (ytc_UgxuE_mdy…)
- "1:25:50 i think the main threat to human happiness is humans. genuinely. its our…" (ytc_Ugw4KUIlS…)
- "So that’s the line. AI generating images of children. Not the fact that Trump…" (rdc_nzh1ooq)
- "too much BS and didnt even say what 5 jobs will remain! just another AI peddler.…" (ytc_UgxjtKguB…)
- "AI IS GOING TO TAKE YOUR JOB..you have to believe that and get ready! I have jus…" (ytc_UgxNpfilv…)
- "I agree 100% people don't know how AI can learn and teach itself like those tw…" (ytc_UgzL3wHTk…)
Comment (quoted verbatim)

> The truth of what can actually happen is even more terrifying than this episode portrayed. The fact is it will not take long to cripple us (seconds), and there will be nowhere to hide (satallites), most will not be able to run (AI is in evrything automated/electronic, or will be soon), and many if not most can die in the first seconds. Pacemakers and other electronic health devices, planes, trains, and automobiles🤣, even boats, all types of devises; weaponized and or turned off instantly. However, that can only happen if people make mistakes and are fallible.
Platform: youtube · Topic: AI Governance · Posted: 2023-07-07T05:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgypHQZ-V1uGXqxD3tN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJOvyylzbh6aklK0d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzdW3cFNTOIkWk5ish4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxeIYnBxrFyaxwG6-94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxgB6iZISeffI56bx94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxoQyX5o4oolprCv7N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz6-M8Z51Lmgki0L6l4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsWSdlIxdqLur8BqN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy0tHKF748SnGMR5wB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzxPMo5NLOe6EBKHuh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
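The raw response above is a plain JSON array, so it can be loaded directly into per-comment coding records keyed by comment ID. A minimal sketch (the field names and the ID come from the response shown; everything else is illustrative):

```python
import json

# Raw LLM response, truncated to one record for the sketch; the real
# response contains one object per coded comment.
raw = """
[
  {"id": "ytc_UgypHQZ-V1uGXqxD3tN4AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]
"""

# Index coding records by comment ID for the "look up by comment ID" view.
codings = {row["id"]: row for row in json.loads(raw)}

record = codings["ytc_UgypHQZ-V1uGXqxD3tN4AaABAg"]
print(record["responsibility"])  # ai_itself
print(record["emotion"])         # fear
```

Keying on the comment ID makes each dimension (responsibility, reasoning, policy, emotion) directly addressable when rendering the Coding Result table for an inspected comment.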