Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
the only regulation that will make sense for ai, if manufacturing plants replace…
ytc_Ugxw02dGH…
I drive a 44T articulated lorry in the UK which has notoriously terrible roads a…
ytc_UgwFeYT3W…
sounds like most modern music. i hope AI takes over mainstream music so we can h…
ytc_UgwAlUDo_…
Thank you for your observation! Sophia's movements may remind you of the Chucky …
ytr_UgwppvNz9…
AI as all the farmer disruption technologies comes with a fear of employ destruc…
ytc_Ugx3zzht0…
14:43 Is it safe ... to give ai access to weapons... bruh if it was safe, my gue…
ytc_Ugzt4WBjV…
Thank you for pointing out the correct spelling of "Sophia" with a 'P-H'! In the…
ytr_UgzTmccF8…
picking a fight with criminals? for you, there are good and bad crimin…
ytr_UgyQ8oJ2u…
Comment
Like i know this will be naive. But Still it is not impossible. Why don't we advicate for a global stop in development of AI. And people might say, that will stop many "good" developments and improvements thats gonna be made by AI. But i think it will be better to not have AI, postpone these developments by allowing humans to eventually make them in the longer future, and hence 100% save humanity, and keeping us on top of the food chain. I think more people need to advicate and use our democratic societies to put STRONG regulation and then global relations to make the rest of the world to stop. Maybe war will be necessary. In my opinion, it is much better to force humans to not have super intelligance, but humans remain the ones with control. That will save our long future. This is actually more serious than what anyone thinks. If we are not on top of the food chain, we are no longer in control. There is no escape from that.
Platform: youtube
Topic: AI Governance
Posted: 2025-09-07T20:3…
Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyrZNmzEHlifvkw7TN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5s9nh01fPzeAFjWR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxRTmcAcXOomnGyT3J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxUixXMUZODh5zHdSR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzSGj-RphEmBQyJMxZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwAdv6CDZADx_rbXMx4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
{"id":"ytc_UgxIILrzmckewwyNDMJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy9jqGoGrCR3XomqP54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxXbxJUnnW1-k8f-nF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy_NySsFsoqlt9-jAN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
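The lookup by comment ID described above can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the `index_by_id` helper and the `DIMENSIONS` tuple are assumptions, with the field names and example IDs taken directly from the JSON response shown (truncated here to two records).

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """
[
  {"id":"ytc_UgxUixXMUZODh5zHdSR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyrZNmzEHlifvkw7TN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(payload: str) -> dict:
    """Parse the JSON array and index each record by its comment ID,
    rejecting records that are missing an ID or any dimension."""
    indexed = {}
    for rec in json.loads(payload):
        missing = [d for d in DIMENSIONS if d not in rec]
        if "id" not in rec or missing:
            raise ValueError(f"malformed record: {rec!r}")
        indexed[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return indexed

coded = index_by_id(raw)
print(coded["ytc_UgxUixXMUZODh5zHdSR4AaABAg"]["policy"])  # → ban
```

Indexing by ID makes the per-comment lookup O(1) and surfaces malformed model output immediately, rather than failing later when a coded value is read.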