Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "AI can be an amazing thing for humanity in terms of equalizing all of us. Will w…" (ytc_Ugyl_Mt71…)
- "This is always going to be a problem with AI chatbots, and that's why they reall…" (ytc_UgxqloSld…)
- "Haha! It seems like Sophia might need a break after all that wisdom sharing and …" (ytr_UgyhQI0dz…)
- "Crazy how AI just kinda popped up out of nowhere. But AGI is the real threat…" (ytc_UgzcFAy6t…)
- "Id like one , take my money and shut ..up . Id like brown eyes , brown hair , an…" (ytc_UgwFwTY8e…)
- "Robots have no moral compass, they cannot feel pain nor have a conscience. We hu…" (ytc_UgzdKERlB…)
- "I want an AI robot thats smoking hot makes cocaine and can get and do liver tran…" (ytc_Ugy8Vr9LX…)
- "AI will never replace what I can do, It can even do the simplest of things with …" (ytc_UgyIL88f7…)
Comment
The Value Alignment problem is the holy grail.
Aligning AI with Human values, but then you've got to decide on what "human values" actually means, and it can't be half arsed like the usual human politician types half arse it all the time.
youtube · AI Governance · 2023-07-07T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgySKW176UPvripbH5x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyjY2dXlFoeIhNacMR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxaDIhRkSCKxtWw5nB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxRbWRzLCpjC675Vs94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyFrtnsGVlYL77Hf-B4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx5mTfOeUxvWBJMBP14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwgWQElQQR_t2y_Uo14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlnSezJ9FGb_BLqYh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzPX3-Gh9zoltjM77V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgycQu9Gv_dnxZkCA4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
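The raw response above is a JSON array of per-comment codes, one object per comment with the four dimensions shown in the coding-result table. A minimal sketch of parsing such a response and looking up a code by comment ID (the allowed category values below are only those observed in this response; the project's full codebook may include more):

```python
import json

# Category values observed in the raw response above; the full codebook
# may define additional values (assumption).
SCHEMA = {
    "responsibility": {"none", "developer", "ai_itself", "government", "unclear"},
    "reasoning": {"unclear", "contractualist", "consequentialist",
                  "deontological", "mixed", "virtue"},
    "policy": {"unclear", "regulate", "none", "liability", "ban"},
    "emotion": {"indifference", "mixed", "approval", "fear",
                "outrage", "resignation"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response and index the codes by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        # Reject records whose values fall outside the observed categories.
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Look up one coded comment by its ID, as the inspection panel above does.
raw = ('[{"id":"ytc_UgyjY2dXlFoeIhNacMR4AaABAg","responsibility":"developer",'
       '"reasoning":"contractualist","policy":"regulate","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_UgyjY2dXlFoeIhNacMR4AaABAg"]["policy"])  # regulate
```

Validating against the observed value sets catches the common failure mode where the model invents an off-codebook label mid-batch, rather than silently storing it.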