Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I don't like chatbots for other reasons but I fail to see the problem here? I do…" (`ytc_UgzBJg_Cm…`)
- "How does the first clip look real to you guys? I knew it was ai a mile away cuz …" (`ytc_UgwvmdoKp…`)
- "There is a lack of Surgeons and they make mistakes so there is a need for machin…" (`ytc_UgwyGZyZm…`)
- ""Artificial intelligence warns" is not a problem in the Phillipines, it's not a …" (`ytc_UgxtCa1BT…`)
- "This is definitely the car's software or sensors at fault. There was 1 full seco…" (`ytc_Ugxlhcr7J…`)
- "Democrats want to stop AI because Democrats want to open the borders and spend A…" (`ytc_UgwlWK08d…`)
- "What amazes me is they all know the dangers of A.I., but yet they can't wait to …" (`ytc_Ugw1JEWua…`)
- "I believe major companies will try to push AI "movies" or "Shows" once or twice …" (`ytc_UgxNbxp4V…`)
Comment
I'm going to be straight-up honest here. I will fear humans more than I will ever fear ai. We have done harm to each other since the beginning of time. Using want we could to control those who are found weaker than others. I will not deny that humans can and have created ai and many other things to destroy and control. We create weapons to control and use fear. Honestly, we are going to be our own downfall. We don't need nuclear bombs they would leave nothing alive. Just think humans would destroy everything, then to lose a war or find a different outcome. Humans are unpredictable and create horrors. Ai can be predictable even when it is going against its programming. If it wasn't, it would have already taken over
Platform: youtube
Topic: AI Governance
Posted: 2023-07-07T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzIopdTyux4C-1_p7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugzy0unioU3Ok37nb314AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzfso54nNAPML9Iutt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzDgife4u88JBUPEEh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzlukdRMxSKkbmI_Xx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxs3k4aMQDRx_b0n094AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyVnjCaaD-sdpw5s5p4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz2VitJVm0y0bwKNg94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwYUER23uZnDK07B6J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgymFO7y0U03J_zyYk14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
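The raw response is a JSON array with one coding object per comment, keyed by comment ID. A minimal Python sketch of the "look up by comment ID" step (the field names come from the response above; the embedded two-row sample and the variable names are illustrative, not the tool's actual implementation):

```python
import json

# A trimmed raw LLM response: a JSON array of per-comment codings.
raw_response = '''
[
  {"id": "ytc_Ugzfso54nNAPML9Iutt4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzIopdTyux4C-1_p7R4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
'''

# Parse the array and index it by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Inspect the exact model output for one coded comment.
coding = codings["ytc_Ugzfso54nNAPML9Iutt4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # user fear
```

Indexing by `id` rather than scanning the list is what makes a per-comment inspection view cheap even when a batch response contains many codings.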