Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Yet for all these interviews, all the speculations and advances, the competition…" (ytc_UgzjVcSAf…)
- "Once I was talking to Google's gemini ai and it was all going well until it just…" (ytc_UgwOdjkCA…)
- "I figured people would be ok with AI art because it makes art more inclusive bec…" (ytc_UgxjaohrW…)
- "the issue is that ai used to be unable to create an image of a dog. now its maki…" (ytc_UgzGtLS5N…)
- "I gave GPT-5 a chance to respond: Why is Neil deGrasse Tyson overrated? ChatGPT…" (ytc_UgwzANbV3…)
- "See it is not just about ai in job market its about longer term impact. What has…" (ytc_UgwORSwtO…)
- "If it gets worse than inhaling a.i. loaded self assembling nanobots which cross …" (ytc_Ugz3KqlS6…)
- "The "statistical" explanation is a big over-simplification and doesn't account f…" (ytr_UgxnduCsu…)
Comment
Ok, but it’s not free. Why do people think that replacing humans is free. This stuff costs $100n of dollars. Here is my prediction. Some will move to AI and robots, it will cost waaayyyy more then they expected. Things will break, it will get things wrong, humans will lose trust of it, and bring many humans back to supervise the AI and robots.
youtube · AI Governance · 2026-02-04T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyt9HJCunb5awHFWl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwzt5y1FoV0JO0n_eJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzGziWvLciFPdqJVKN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4D4OE4cRuv8H9ged4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz_FQUbwQI9CVW_6YN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxvfdH3IDsVd9W06Ot4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwh5_LPYr2AlQ5XIYx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyTqthXtb5pqttUxbJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyerWD-5t1cmMPRPLB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzRwb5A12VnKr-cikx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
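The raw response above is a JSON array of per-comment codes. A minimal sketch of validating such a batch before storing it might look like the following; the function name is hypothetical, and the allowed value sets are only those observed in the sample output above, so the full codebook likely defines more:

```python
import json

# Category values seen in the sample batch above; the real codebook
# may contain additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "liability", "regulate", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "mixed", "fear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and keep only rows whose ID has a
    comment-style prefix and whose values fall inside the codebook."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if row.get("id", "").startswith(("ytc_", "ytr_"))
        and all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

Filtering rather than raising keeps a single malformed row (a hallucinated category, a mangled ID) from discarding the rest of the batch; the dropped rows can then be re-queued for recoding.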