Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "and to think, i thought that black mirror episode with the robot dog was totally…" (ytc_UgxzCAtI4…)
- "@zenmusic8 Sounds like severe child negligence. Had the parents actually cared f…" (ytr_UgzJ9-DBd…)
- "why..?...I do not WANT a ''job''..i just want to me moment to momen…" (ytr_UgwRO4pqK…)
- "I've been using gemini for the past few months. and I treat it like a teacher (w…" (ytc_UgzxEhO-h…)
- "@statquest again, thank you very much for your reply. I read this somewhere : …" (ytr_Ugzpejcka…)
- "In my opinion this I very easy way to fixed this issue All the company's have t…" (ytc_UgxV7XpT0…)
- "@thefrench8847 yet they correct their mistakes and honestly with science ai wou…" (ytr_UgzgFxHnX…)
- "Oh god, they're already getting fed up, and its starting with the ones that dont…" (ytc_Ugxs1rxs7…)
Comment

> But a distant future is not realistic in light of the continuance of progress in AI researches. The only scenario in which humans survive ASI is where World War Three is stopped by ASI… but it isn’t stopped because of the want to save humans from destruction, but to save itself - to save ASI itself.

youtube · AI Governance · 2024-12-28T17:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzLalF5qZisXA-WpRp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwJYgKQwb3dqsXp45d4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzjyMHcq_HOeFGFMFd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxkZ8HgtBLsk_prJrV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwU7mx5xF2SP04okEZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzg9IFHY6YJHeRgp394AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzlKSZD4eep9wWPYX94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwDxZvRXRp1F8s4Pz14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy3G4jZGI6AdN7jv7J4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"resignation"},
  {"id":"ytc_UgyPh8KajmKqhfUnv3x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
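The raw response is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such output could be parsed and indexed for lookup by ID, as this page does. The dimension vocabularies below are inferred from the sample values shown above, not from a confirmed codebook, so treat them as assumptions:

```python
import json

# A small excerpt of raw model output in the same shape as the
# response shown above (two entries copied from it).
raw = """
[
 {"id": "ytc_UgwU7mx5xF2SP04okEZ4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_UgzLalF5qZisXA-WpRp4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

# Allowed values per dimension -- inferred from the sample output,
# so these sets are assumptions rather than the project's codebook.
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def index_codes(raw_json: str) -> dict:
    """Parse the array and index each coding by comment ID,
    skipping entries with out-of-vocabulary values."""
    indexed = {}
    for entry in json.loads(raw_json):
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            indexed[entry["id"]] = entry
    return indexed

codes = index_codes(raw)
print(codes["ytc_UgwU7mx5xF2SP04okEZ4AaABAg"]["emotion"])  # fear
```

Indexing by ID makes the "look up by comment ID" inspection above a constant-time dictionary access, and the vocabulary check flags any response where the model drifted outside the expected labels.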