Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "She didn’t make them they were deepfakes. Someone else made them and profited of…" (ytr_UgzkrbwW6…)
- "Well, regarding some people's attidute, things have gone a bit too far I guess. …" (ytc_UgxG5p3It…)
- "I trust dumb AI more than people because it is dumb but honest. I also believe t…" (ytc_UgwFy_nDC…)
- "As overhanded as this is going to sound, there simply isn't any other way to put…" (ytc_Ugx52BnGL…)
- "There are people that want a ai robot future, with autopilot cars and planes, en…" (ytc_Ugx_6dSfQ…)
- "Someone on reddit said Google is being dismissive about ai concerns because Goog…" (ytc_Ugyzz3pBl…)
- "I don't think AI art is good, just like other people say. Every time I look up a…" (ytc_Ugzrh4Qwk…)
- "how about even if you see a real celebrity endorsing a product for real that you…" (ytc_Ugwu_PkXN…)
Comment
People need to realize that these iterations of A.I. don't have consciousness. A great example of how they approach a problem is analogous to humans (like Elon Musk said) that when we build a highway, and have to destroy an anthill along the way, it's not that we hate ants, they were just in the way. The same holds true of A.I.. Another great example is the "paperclip" theory, (Google it, it's worth the time to fully understand the "mindset" of A.I.)
youtube · AI Moral Status · 2023-03-02T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz4EmRqAG_hTLCUdUR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzH1ivb-HqKM1oe3SR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz4d4DD_VoZtBcUROx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxIi9mCVnjfTwVKZpx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzMVnsrOjH9lfmzE494AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgznWLFf3xjCMQwah9Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxuxdaKwn77-p24gKt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwcALLSUchDYTLXNup4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw3QoJyVtXptX2TCU14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxJcvxa1nnld8QnV9x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
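The raw response above is a JSON array in which each object carries a comment ID plus the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is below; the allowed values are only those observed in this sample, so the real code book may contain more (the function name and value sets are assumptions, not part of the tool):

```python
import json

# Values observed in the sample response above; the full code book may be larger.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"indifference", "approval", "resignation", "mixed", "fear", "outrage"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments) into a
    dict keyed by comment ID, rejecting values outside the known code book."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={rec.get(dim)!r}")
        # Keep only the coding dimensions, dropping the redundant ID field.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Looking up a coded comment by ID then becomes a plain dict access, e.g. `parse_raw_response(raw)["ytc_UgxIi9mCVnjfTwVKZpx4AaABAg"]["emotion"]`.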