Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples

- "This video completely fails to deliver on its title. It doesn't show how bias i…" — ytc_Ugy-xNOm5…
- "No you aren't. An AI bot will be able to do your job easily; eventually.…" — ytr_UgwfWakac…
- "How is AI being used right now? To generate videos that warn of the entanglemen…" — ytc_UgzPPLr7t…
- "Been an admirer of Yudkowsky since spring 2023. But I try to believe that AI is…" — ytc_UgyaxYhHV…
- "Hard note they are never mentioning. If u say ai can't train and use remember th…" — ytc_UgzXJu5to…
- "I also use AI exclusively as a slightly better thesaurus for writing. It's so ea…" — ytc_UgywgBmWP…
- "AI users are all alt right weirdo plagiarist who sometimes pretend to be normal …" — ytr_Ugz_WgA6D…
- "Thank you Bernie! As soon as I realized Project 2025 and techno-oligarchs like …" — ytc_UgzSAtrQU…
Comment

> So the whole robots and AI taking over the world and enslaving humans is not the scary part for me, this has been predicted by so many people even the AI. However, my problem is this, when humans create robots and AI to do the jobs that other humans were doing to make a living, what will happen to the human population that is trying to make a living and survive? Have humans thought of that? Do they know the effect of this technological development on the human race? Why does no one think of that?

youtube · AI Moral Status · 2023-10-06T22:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwhUAWM9-_PS93RejZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzdpZacPKGPF_V2r2p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx4J3KzA1TlJDQhMwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyBUKuvzUs8HQJGsXB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzO7ngG3rw23UB_wUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgyI8ll24EVWRiJBsxB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwrpo-JQQfst_FvEtx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyXE6wrrp6XCBod7oV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzn02_8cY-UB0yvgqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxER84T96tSU68L16J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}]
```
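A raw response like the one above is a JSON array of per-comment codes, so it can be turned into a lookup keyed by comment ID with the standard library alone. A minimal sketch (the sample row is copied from the batch above; the four field names match the Coding Result table):

```python
import json

# One row copied from the raw batch response above.
raw = ('[{"id":"ytc_UgyXE6wrrp6XCBod7oV4AaABAg",'
       '"responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"regulate","emotion":"fear"}]')

# Index the batch by comment ID for O(1) lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the coded dimensions for a single comment.
code = codes["ytc_UgyXE6wrrp6XCBod7oV4AaABAg"]
print(code["policy"], code["emotion"])  # regulate fear
```

In practice you would also want to guard against malformed model output (missing keys, non-JSON text) before indexing, since the response comes straight from the LLM.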