Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Elon has a Super Computer and a personal AI. Bad boy bad boy! Whatcha y…" (ytc_UgzgKRwd_…)
- "It should require companies that want to build large artificial intelligence cen…" (ytc_UgyGyCTRA…)
- "I'm in my fifties and just want AI slowed down a bit so that I'm dead before we …" (ytc_UgwYb2kSO…)
- "i think we should have a direct democracy in place to represent the people of th…" (ytc_UgygyuqHq…)
- "My question is the AI inteligence now can revolver that what a human being can ,…" (ytc_Ugzz4gwfj…)
- "i think the assigning human attributes explains what were doing now. ai is reall…" (ytc_UgyFqlJLb…)
- "This actually made me smile so hard 😭 seeing the diversity and creativity in eac…" (ytc_UgwPh5W4F…)
- "Could we make our own Art Station \"Art House\" Make it no AI allowed and no data …" (ytc_Ugy1D9gh8…)
Comment
This is stupid when the entire world already know that A.I robot will eventually take us over but then why will people she invent and make it better for them to kill us? There's even a movie called "Martix" is just abt A.I told over us... If people still inventing it and make it stronger than in 2050 human will regret what they/we did now. Just like in world war 1,2 German should never attack Russia both world war but as always human never learn from there past / mistake. Sign~~ hopefully i don't have to see that day in my life
youtube · AI Governance · 2024-10-28T03:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzRGitGouB2qeddoIp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugw-VqcSgqWZ_D5qDXN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyl8EozFvC3TxeYk6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxcEKqx46SoxEBUZtB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzv6wvZgcwNbnyCT054AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugx3MG22wNHGaPUr6iF4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx6Rc7KOL6BGJG27RR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzMl2wP9Z_OfqMIQcp4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzVN8kFi3vJTfO0dQV4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyB60w9ldb7AV_QBy14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"}
]