Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing random samples.

Random samples:
- `ytc_UgxSt522j…`: Maybe AI can be the employee, the employer and the customer. And leave the rest …
- `ytr_Ugypak9sR…`: He's not entirely wrong but I can see how that might sound like a non-answer A …
- `ytc_UgwgRaLy0…`: Yo! I don't think algorithm (at least YouTube's) work very well. I keep dislikin…
- `ytc_UgzzpgOLI…`: We should be grateful for God and ai is the decrease of humans potential 🥊 Jes…
- `ytc_Ugxe1TzVY…`: Its look like robot picture that time also they are harmful for people that time…
- `ytr_Ugx6eolNI…`: My immediate thought back when AI-art started getting popular was "what a cool t…
- `ytr_UgzIQDFbI…`: Because the US is further ahead in this technology, so he's saying that if the l…
- `ytc_Ugwn0Jf_E…`: is it just me or would it be funny to give chatgpt a Ben Shapiro AI voice?…
Comment

> One thing I still don’t get about AI: motive. If we ever build AI that’s vastly more intelligent and it’s not trained to love or hate humans, what then? Where does its drive come from if there is no seed planted by a human. What gives it the “impetus” to act autonomously in the first place? Where do we cross that line moving from advanced tool to "being".

Source: youtube · Topic: AI Governance · Posted: 2025-09-12T20:3… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
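The record behind this table is one element of the raw JSON array below. As a minimal sketch of how such a record could be checked against the coding schema, assuming each dimension takes values from a closed label set (the sets below are only those observed in this sample response; the full codebook may define more):

```python
# Minimal sketch: check one coded record against the coding schema.
# The label sets are inferred from the sample response on this page;
# the actual codebook may allow additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"approval", "outrage", "fear", "resignation", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    if "id" not in record:
        problems.append("missing comment id")
    for dim, allowed in ALLOWED.items():
        if record.get(dim) not in allowed:
            problems.append(f"{dim}: unexpected value {record.get(dim)!r}")
    return problems
```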
Raw LLM Response
[{"id":"ytc_Ugz_GgRofhaebte4akl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1UuIXVZML5omuIX54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwVLRFooUKb7gsXvKd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwVlJ3LD6I0Yy2fN1l4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwFV-kKQNEBZSsxp414AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKb9fr7dDumrsWB5h4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxD1-suZARD1m15JIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzRFUviriV5sIft83B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzh64BHvT3QEWH4coJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw0zpLccETyGnLz-G94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}]