Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Grateful I have my own business and there is no way AI can do it.…" (ytc_UgxqQ7Kz_…)
- "The principled reason that AI can never be conscious is the question. Right now…" (ytc_Ugy8EVyq9…)
- "Stephen Fry can't even wrap his head around trans people existing, how is he the…" (ytc_Ugx6M_WEJ…)
- "Us writers and artists are fucking doomed if people start preferring AI generate…" (ytc_Ugx21vvqg…)
- "What these AI company needs to do is allow artists to monetize their AI data if …" (ytc_UgwIqjYiV…)
- "We literally can't even prove that other humans are conscious beings. How would …" (ytc_Ugwyql58q…)
- "Beginning of the end of HUMANITY. People are laughing now...but think about the …" (ytc_Ugx9Idohx…)
- "Ong making me think I’m not real and shi bro it’s scary I don’t know to believe …" (ytr_UgxE9uTRU…)
Comment
So here are examples of exactly what this video was pointing out. Eagle Eye, Chappie, Free Guy (not really about violent AI, just AI learning after its developer gave it the proper script to follow). I get that movies are written out to be the way they are but honestly, as long as fail safes are put in and the developer knows what it's doing and these programs stay under the human thumb, I don't see any of this being an issue. I rather see a touchscreen order station replace humans at fast food places but that's another topic all to itself. I wouldn't worry so much about this if I'm going to be honest, plus this is coming from someone with 15+ years in the tech sector.
youtube · AI Governance · 2023-07-07T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
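Before trusting a coding result, it is worth checking each dimension against the label set. The sets below are inferred from the values that appear in this section's raw response; the actual codebook may define more labels:

```python
# Label sets inferred from values observed in this section's raw
# response; the real codebook may be larger (an assumption).
ALLOWED = {
    "responsibility": {"government", "developer", "ai_itself", "user", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "approval", "outrage", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return dimension names whose value is missing or outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

coding = {"responsibility": "developer", "reasoning": "consequentialist",
          "policy": "regulate", "emotion": "approval"}
print(invalid_fields(coding))  # → []
```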
Raw LLM Response
```json
[
{"id":"ytc_UgzSilu4l-4lvyzD2wN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwMCZGDcf6hyBdbgNR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgwuWBXNfsVUlfd7FqR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzR-z6jrKVAZOdZ6J14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxKBUOMCzgbsko_7rR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy02fx5bpidwCPPAFF4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDZfWKfZ4NQuG601t4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyx-qMxCRfe1LmWPr14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxhHa_MnYW0Lzgd8ut4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyV4QzQZws8sI4Cgmt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
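The Coding Result table above can be derived from one parsed record in this response plus a coding timestamp. A sketch, assuming markdown rendering (the helper name is hypothetical):

```python
import json

def coding_table(record: dict, coded_at: str) -> str:
    """Render one coded record as the markdown dimension table shown above."""
    rows = [
        ("Responsibility", record["responsibility"]),
        ("Reasoning", record["reasoning"]),
        ("Policy", record["policy"]),
        ("Emotion", record["emotion"]),
        ("Coded at", coded_at),  # taken from coding metadata, not the record
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {dim} | {val} |" for dim, val in rows]
    return "\n".join(lines)

# Record copied from the raw response above; timestamp from the table above.
record = json.loads(
    '{"id":"ytc_UgwuWBXNfsVUlfd7FqR4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"approval"}')
print(coding_table(record, "2026-04-26T23:09:12.988011"))
```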