Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews with comment IDs)
- "Listening to you probing an ai when you got your answer from the start is the mo…" (ytc_UgwQTnA6j…)
- "ai will be better in everything even in craftmansship in combination with robots…" (ytc_Ugy5xrX0W…)
- "What about the autonomous driving capabilities of other car manufacturers? How m…" (ytc_UgwMacZoQ…)
- "Whenever I’ve tried to use ai art, it’s never what I want. Only a human can crea…" (ytc_UgzFbBzIy…)
- "*humanity afraid of AI causing our extinction* Tech company: “let’s give them al…" (ytc_Ugww3_pmT…)
- "Wouldn't you think pushing this AI is just going to make humanity even stuipeder…" (ytc_Ugw9p7WG6…)
- "its not a joke of a vehicle, no other car even has self driving, just because it…" (ytr_UgyL1ZGzk…)
- "I've heard that facial recognition technology is not as accurate or is effective…" (ytc_Ugx320go3…)
Comment
"Well, the problem is: it’s a race for everything. So, safety isn’t in the equation. Like a movie that doesn’t end well. So, are the smart people really that smart in the end? No. Humans aren’t smart compared to future Ai anyway. Many “experts” true PDoom(no public statements) are vastly higher."

Source: youtube | Topic: AI Governance | Posted: 2025-10-19T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy0XbWExcw1UATlrCZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxdd5LhOa4BqgfiVYJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwXhVfzoMiIICL_VrJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxn71dq8OryH5hEXGx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwPVQ3YuLhG9kEyktR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxVF3BV6PccJUZLrGh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxCuyjxFjBVQedL6wB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnSbFOhb6ZTUZkOSF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugwt6IhKX3j2vNFEkI54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwlkvU-6T5Cs7xrl5d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
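The raw response above is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a response could be parsed and validated before storing it — the dimension vocabularies below are inferred only from the values visible on this page, not from the project's actual codebook:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the real codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "company"},
    "reasoning": {"mixed", "consequentialist", "unclear", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "fear", "mixed", "outrage"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index entries by comment ID.

    Raises ValueError if an entry is missing a dimension or uses a
    value outside the inferred vocabulary.
    """
    coded = {}
    for entry in json.loads(raw):
        cid = entry["id"]
        for dim, allowed in SCHEMA.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {value!r}")
        coded[cid] = {dim: entry[dim] for dim in SCHEMA}
    return coded

# Hypothetical single-entry response in the same shape as above.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
print(parse_llm_response(raw)["ytc_x"]["emotion"])  # fear
```

Indexing by `id` makes the "Look up by comment ID" view a dictionary access, and rejecting off-vocabulary values catches the occasional malformed model output before it reaches the coded table.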