Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
All it takes is one little bug to change it's core fail-safes. Of course the pro…
ytr_UgxZZS0fA…
Ah yes, the old "modern art isn't shit because (insert some BS about "meaning" o…
ytc_Ugwf7RLEh…
I see a lot of negative comments. But as medical professionals. I would rather s…
ytc_UgyNdVMBh…
After successfully using ChatGPT to generate some python code for a postdoc proj…
ytc_Ugwf__Ha7…
Well, if you installed Ollama and ran a model locally, e.g. Qwen, then you'd hav…
ytc_UgwaO50VV…
Not every innovation is to be feared, I think AI is a wonderful thing! I'm excit…
ytc_UgyKDauE0…
Hey Sophia tell your brother & all your robot friends I gotta 12 gauge pump to w…
ytc_Ugz0y9Djr…
1:22:00 "I don't *believe* it was decieving him" well, i don't believe you have …
ytc_Ugy54_8ct…
Comment
I have zero faith that any politician in our government would regulate AI in any way that would benefit the citizens over the corporations that hand money under the table to them.
I also think the real unemployment numbers are something in the realm of 24% with far more at risk as robotics advance.
We tried, very hard to argue for UBI but it was a lost cause because people are terrified of being taxed when we print seemingly unlimited currency for foreign wars and foreign aid.
Someone explain to me how we have a shred of hope in America at this point.
youtube
AI Governance
2024-05-08T17:5…
♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz01_5_Hb6yn5xKq5N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzwurJCdaaT_5UaW314AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzBHMx8_t6BkMBj8Ox4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzrhsbrOvBqAxJLej54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"concern"},
{"id":"ytc_Ugw2ZTplbMprtYsaOol4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyd1JzU5FReyNYZfL54AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxbueR2JaacTRJvoCJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7Q9frg44rugvZLZx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwG8o6HESvDVl6Z8lJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz0j3RPi6242EsIZhx4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
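The raw response above is a JSON array of per-comment codes, each carrying the comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for the "Look up by comment ID" view — the variable names are illustrative, and the single-row `raw_response` is abbreviated from the array shown above:

```python
import json

# One row abbreviated from the raw LLM response above; the real
# response is an array of ten such objects.
raw_response = """
[
  {"id": "ytc_Ugz0j3RPi6242EsIZhx4AaABAg",
   "responsibility": "government",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "resignation"}
]
"""

# Index the coded dimensions by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_Ugz0j3RPi6242EsIZhx4AaABAg"]
print(code["emotion"])  # resignation
```

This matches the coding result shown above for the displayed comment: responsibility `government`, reasoning `consequentialist`, policy `regulate`, emotion `resignation`.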