Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- “You think that will develop psychiatry of robot and human or they choose to free…” (ytc_Ugy6I2bjR…)
- “That's a fantastic idea. The autonomous vehicles should have phone numbers & QR…” (ytr_Ugyl_EuSV…)
- “AI can't even get driving or depicting human hands right, and the military expec…” (ytc_UgwU3oSiR…)
- “AI is not bad on its own it's a mirror, it learns to hate and be destruktive fro…” (ytc_UgxgKX_Qa…)
- “I've always had major doubt about AI, not because i believe in the creator, yet …” (ytc_UgwB8e4Sx…)
- “Definitely love open source options! Op, set yourself up with a local or hosted …” (rdc_n7leyfl)
- “I test drove one last week and was so hyped! Didn’t care for self driving becaus…” (ytc_UgyT3rC9o…)
- “Yuval: iTs beEn 5 yeARs aNd raILrOaD hasN't cHANgeD our lIVeS / also Yuval: it's …” (ytc_UgwkRN27A…)
Comment
If a robot with AI becomes much more intelligent than humans, why would that AI want to do our plumbing or drive us around. Being the intelligent one, it would probably want humans to do all the monotonous jobs while it concentrates on the smarter jobs. It would be like asking the smartest Rocket Scientist to give up his job to become a garbage collector.

Source: youtube · Topic: AI Governance · Posted: 2025-09-06T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgweHC1eVXkfl4OZ3ZZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvE4dmw0ahIWbGUtB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzAykJSm34mCfPNqMF4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwVYLumnfaE49TZKXd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-ddxOnlSm595SBy94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxxFpax02H4eRyInk54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzCZRX7ANUzGDO5UWl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxZAxDWgQByltA5fBR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyl6nNZF7yNkAEqEIR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwlqtVn58AKx1Csll54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
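A response like this can be parsed and indexed by comment ID in a few lines of Python. The field names (`responsibility`, `reasoning`, `policy`, `emotion`) match the coding dimensions in the table above; the allowed-value sets in `SCHEMA`, and the function name itself, are assumptions drawn only from the values visible on this page, not from the full coding scheme.

```python
import json

# Allowed values per dimension (assumed from the examples on this page;
# the real coding scheme may define more categories).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "company", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "indifference", "fear", "outrage", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, validating each dimension."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Look up one coded comment by its ID.
raw = ('[{"id":"ytc_UgwVYLumnfaE49TZKXd4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
codings = index_codings(raw)
print(codings["ytc_UgwVYLumnfaE49TZKXd4AaABAg"]["emotion"])  # indifference
```

Validating against a fixed value set at parse time catches the common failure mode where the model drifts outside the codebook, rather than letting a misspelled category silently enter the coded dataset.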