Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- "The corporate interest groups love AI so much but also cant understand how their…" (ytc_UgxPaQovH…)
- "I think it will make simple automations and scripts easily accessible to generat…" (ytc_UgxIVpnxN…)
- "Remember the show Knight Rider? They kids loved that show. They were trying to…" (ytc_UgyK-SwrL…)
- "Heck even Open AI just makes how to book a trip video on application of AI.. xP.…" (ytc_UgwnHqyQk…)
- "Another reason: chat bots learn from our interactions. We don't want them to det…" (ytc_Ugz-_ebKH…)
- "I'm Leaving AI Creator I'm No Longer Making AI Videos I'm Back to Human Artists…" (ytc_UgwHvg3Ht…)
- "They’re grasping at straws to give us a reason to be interested in AI while it d…" (ytc_Ugy9Z4SMZ…)
- "Hey all, the darker skin color creating more sesitivity is because of the natu…" (ytc_UgxRN8olB…)
Comment

> The major risk with AI is the same risk with just I, believing every single thing you hear or read. Typical computer nerd. I don’t care what they call it. It’s still artificial no matter how intelligent it is and if humans are done enough to put them in charge of things like when and how to launch nuclear weapons yeah, we may have a problem. Biggest risk I can see, at least for now, is believing you can put all the information necessary into a computer and it’s very censor to allow the computer to operate your vehicle. That’s pretty stupid, but a lot of people who are mixed up in computer. Technology are pretty stupid, or at least pretty naïve

- Source: youtube
- Topic: AI Governance
- Date: 2025-08-19T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
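Each coded dimension takes a value from a fixed set; a minimal validation sketch in Python, where the allowed values are inferred only from the responses shown on this page (not a complete codebook):

```python
# Allowed values per dimension, inferred from the raw responses on this
# page; the real codebook may contain additional categories.
ALLOWED = {
    "responsibility": {"user", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "unclear"},
}

def validate(record: dict) -> list:
    """Return the dimension names whose value falls outside the known set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
coded = {"responsibility": "user", "reasoning": "deontological",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # → []
```

A record with an unknown or missing value is flagged, which is a cheap guard against the model drifting outside the expected label set.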
Raw LLM Response
```json
[{"id":"ytc_UgxAGK2I_zS0Kh_IdAJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzkmF6kmv2Q_Mw3MyF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwDWaT5GF3QB-7apT54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw7C64pSSq6zFEOMPN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznVokIgrRVt4_ZKt14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgySod1cyrKIl10n9mR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyd0jeUBnZh0cQ1yIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwAhbAPx2Q0cF9-uw94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzHwC2g-cvmDsMJUn94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwSUEB7ZH6th5LBs294AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]
```
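Because the model returns one JSON array per batch, "look up by comment ID" reduces to parsing the response and indexing the records by their `id` field. A minimal sketch, using two records copied from the response above (everything else is illustrative):

```python
import json

# Two records taken verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_UgzHwC2g-cvmDsMJUn94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwSUEB7ZH6th5LBs294AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

# Index coded records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

record = by_id["ytc_UgzHwC2g-cvmDsMJUn94AaABAg"]
print(record["policy"])  # → regulate
```

The record retrieved here matches the Coding Result table shown for the selected comment.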