Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “chat gpt is good to use to write out a motion drafts, and persuasive arguements …” (ytc_UgwT4C3mQ…)
- “Don’t you think we’ve already lost? Couldn’t AI make us think it’s not under con…” (ytc_UgzSNayZn…)
- “@regis_c impo, it’s not a masterpiece, it’s not even art, and they didn’t make i…” (ytr_Ugx7iThOs…)
- “@bobdunn7003it's not stupid at all. The fact that he sells products that are us…” (ytr_UgyDq2mp5…)
- “Why is it that all people understand llms wrong. Llms dont think they just act t…” (ytc_UgzY7y4hN…)
- “So many excuses for the Tesla, so much confidence in “AI” …Tesla owners are like…” (ytc_UgwkPUAe4…)
- “I liken AI to the atomic bomb back in the day. The cats out of the bag now. Ther…” (rdc_jifd0yv)
- “I hate ai art its not art also being born with a gift for art, still, pure tale…” (ytc_UgzioVi7h…)
Comment
Bad news:
We still have yet to figure out a way to stop AI from killing us. Meanwhile, the US government is making generals of top AI tech companies and is creating weapons of mass destruction and surveillance tools that are integrated with AI.
Good news:
The US government likely doesn't want to be destroyed either, so they will likely pour money into this concern.
However, its wild to me that our forefathers opted for smaller governments, and we live in a world whose future is completely dependent on the government.
youtube · AI Governance · 2025-06-30T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyyXJPBMasXrm_hEr94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxaYd5-UKNlOK1izVx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyzHF6I-H83K0ua73x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwAvcXe-UMTHwZ3iTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzK_jblQYSR7UasQC94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy7MzchbIGvjw20Emx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwkmX0vufWbkovpd8V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxpf7f7EeeApFo8_z54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzDHIM6Pgps5-kQBp54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgwMyjVGm_SdS9xBgdp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
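The raw response above is a JSON array of per-comment codes. A minimal sketch of how such a batch could be parsed and validated — the `ALLOWED` value sets below are inferred only from the values visible in this sample, and `parse_coded_batch` is a hypothetical helper, not the tool's actual implementation:

```python
import json

# Allowed values per dimension, inferred from the values visible in this
# sample output (the real coding scheme may include more categories).
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_coded_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: codes}, rejecting
    any record whose dimension value falls outside the scheme."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{comment_id}: unexpected {dim!r} value {rec.get(dim)!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Validating against a closed value set at parse time catches the common failure mode where the model invents an off-scheme label; a strict parse keeps the downstream table (Responsibility / Reasoning / Policy / Emotion) clean.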