Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "There's no 'intelligence' at all in AI. Thinking that AI models can think like h…" (ytc_Ugy_v7KHA…)
- "These are problems with the AI's training model, not the AI itself. No company w…" (ytc_UgxCnkHlQ…)
- "I see you own a Tesla! This means that the Tesla Robotaxi would adjust to your p…" (ytc_Ugw0NqzS2…)
- "I can't wait until they realize nobody can afford to buy their gadgets, AI and r…" (ytc_UgzGOQNLN…)
- "She couldn't control the trip but she could stop the car. Different actions. I…" (ytr_UgyYMpwFC…)
- "Hey there! It seems like the interaction with the AI in the video might have sur…" (ytr_UgyAXIUH2…)
- "Climate collapse is more imminent and likely than AI threats. AI is much more p…" (rdc_kqugo1x)
- "He really didn't answer the central question on everyone's minds. How many jobs …" (ytc_UgygK2QEb…)
Comment

> It's going to be the other way around... The AI going to make us work for them... Just like what the rich people did to us it's going to do the same thing...😅... If you are going to suffer let's make it quick.... Please somebody invent that super intelligence 😂

Platform: youtube · Topic: AI Governance · Timestamp: 2025-09-22T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzzTlO0YW0irKjnbQl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzWVpZgnYIb94_JAdx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzUubDtKZFqNOAkMhZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxETznNaZ-84NRqTWN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwRoTUiQb_0ropdmcB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy6QdyXvA4RmtgOkP14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgydGpNchjKK8E8vxbB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzlcGhmO6AAasoj7ZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwJVLDysfZGWM2s5Qd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxTZD8_awoqptl4zc14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
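The raw response above is a plain JSON array, one object per coded comment, so looking up a coding by comment ID reduces to parsing the array and indexing on the `id` field. A minimal sketch in Python, assuming the response format shown above (the `index_by_comment_id` helper and the inline one-element sample are illustrative, not part of the tool):

```python
import json

# Illustrative raw LLM batch response, using the format shown above.
raw_response = """
[
  {"id": "ytc_UgwRoTUiQb_0ropdmcB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgwRoTUiQb_0ropdmcB4AaABAg"]
print(coding["emotion"])  # -> resignation
```

The same index supports both views above: random sampling is a draw from the dict's keys, and ID lookup is a direct indexing operation.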