Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Saying you are an artist when using AI is like saying that you're a chef when yo…" (ytc_UgxikR6SL…)
- "The new iStock ai generator creates images based on its library and pays the art…" (ytc_UgyZS4TnH…)
- "We're not in a simulation orhwrwise humans wouldn't be creating AI now. Why woul…" (ytc_UgwNWnvwa…)
- "i copy righted my face, if you want to use the facial recognition on me you need…" (ytc_Ugxz8VfJL…)
- "Yeah I don't even like how people call it "AI art". They are mangled images bein…" (ytr_Ugzz8M-G9…)
- "People will continue to walk or run in front of moving cars and die daily. None …" (ytr_Ugyyf3r8R…)
- "As a computer scientist myself, it's easy to get swept up in the excitement over…" (ytc_UgzzafP_5…)
- "There are now two separate incidents that I know of, that went seriously bad. On…" (ytc_Ugw3k8YnR…)
Comment (youtube · AI Governance · 2023-06-08T15:4…)

> Foolish to be pessimistic? Not so sure I agree with that considering why AI would require humans in 10-20 years. How you going to give control of everything to something millions of time smarter than the smartest human then think your going to maintain control? That's foolish.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzgG1iKqs9H8sGu_W94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgwNtdtZUn9tw39i1sp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx8FEreIAtjvKFDgNp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugxe0m2_JLNPAGQIaXh4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx4A52-7I-ZjdpVO5p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy1qNwfkYwXtH2uryd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz-Wa6qZj8g0QCSPhV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyC_tFXgqbfa1rfDKZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwgjL6M6v1WK1Fd-Kt4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy9iJV0n57zD2BGrBN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
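A raw batch response like the one above is only usable downstream if every row stays inside the coding schema. The sketch below shows one way to parse such a response and flag out-of-schema values; the allowed labels are inferred from the sample output on this page, not from the full codebook, so treat `ALLOWED` as an assumption to be replaced with the real label set.

```python
import json

# Allowed values per coding dimension, inferred from the sample rows above.
# The actual codebook may define additional labels (assumption).
ALLOWED = {
    "responsibility": {"ai_itself", "user", "government", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "sadness", "indifference", "approval", "outrage",
                "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and return rows with out-of-schema values."""
    rows = json.loads(raw)
    problems = []
    for row in rows:
        for dim, allowed in ALLOWED.items():
            # Missing keys and unknown labels are both reported the same way.
            if row.get(dim) not in allowed:
                problems.append({"id": row.get("id"),
                                 "dimension": dim,
                                 "value": row.get(dim)})
    return problems

raw = ('[{"id":"ytc_X","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]')
print(validate_batch(raw))  # → [] (row conforms to the schema)
```

A row that fails validation can then be queued for re-coding rather than silently stored, which is useful when the model occasionally emits a label outside the prompt's enumeration.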