Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI security and regulations should be No1 priority of government agenda.
- kill switch / wipe out programming / shut down network powering AI. Also, all AI should be autonomous independent from each other network, so if 1 gets corrupted the others will not be affected
Also, governments have to introduce Universal payments soon to help joblessness when AI + robots start taking over. Tax the rich people hard + business (especially large ones benefiting most from AI). It is not fair, that rich gets richer and poor gets poorer.
Also, if I was them, I would reintroduce military 🪖 as obligation in case we need to fight robots in the future.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Timestamp | 2025-06-18T06:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
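A coded record like the one above can be sanity-checked against the label sets that appear in this batch. A minimal sketch — the allowed values below are assumptions inferred from the sample response, not a definitive schema:

```python
# Label sets observed in this batch's raw LLM response.
# ASSUMPTION: inferred from the sample below, not an authoritative schema.
ALLOWED = {
    "responsibility": {"government", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference",
                "approval", "mixed"},
}

def validate_record(record: dict) -> list:
    """Return a list of problems with one coded record; empty means valid."""
    problems = []
    if not record.get("id"):
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        if record.get(dim) not in allowed:
            problems.append(f"{dim}: unexpected value {record.get(dim)!r}")
    return problems

# The record shown in the Coding Result table above:
record = {"id": "ytc_Ugwy3tzg_35IfbsbSuJ4AaABAg",
          "responsibility": "government",
          "reasoning": "consequentialist",
          "policy": "regulate",
          "emotion": "outrage"}
print(validate_record(record))  # an empty list: the record passes
```

Rejecting rather than silently accepting out-of-vocabulary labels catches the common failure mode where the model invents a new category mid-batch.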
Raw LLM Response
```json
[
{"id":"ytc_Ugwy3tzg_35IfbsbSuJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxvTQjGtcaTSxSZa0R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugy5JLmp0FPHzCykg4x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxWhsHYC7kwd4VxrLt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwj3LopMFtY5kbV0Tl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwQ61jlNdl4Hd8RKi14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxbdVmZBcGqUhzUfDV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz9cBdzBYQ07gGMVUd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzIJna3pMJPJSFihA14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYltnmGGmdCi87ZuN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
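Because the model returns one JSON array per batch, looking up the coding for a given comment amounts to parsing that array and indexing it by `id`. A minimal sketch, assuming well-formed output like the response above (a production pipeline would also need to handle malformed or truncated JSON):

```python
import json

# A two-record excerpt of the batch response shown above.
raw = '''[
 {"id":"ytc_Ugwy3tzg_35IfbsbSuJ4AaABAg","responsibility":"government",
  "reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgxWhsHYC7kwd4VxrLt4AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]'''

# Parse the batch and index the records by comment ID for O(1) lookup.
records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

rec = by_id["ytc_Ugwy3tzg_35IfbsbSuJ4AaABAg"]
print(rec["policy"])   # regulate
print(rec["emotion"])  # outrage
```

Indexing once and reusing the dictionary avoids rescanning the array for every lookup when inspecting many comments from the same batch.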