Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `rdc_gsoi0l0`: "Completely expected comment history from a self centered commenter who couldn't …"
- `ytc_UgyCwGNDS…`: "To get a fully self driving car you need a AGI, since it needs to be able to und…"
- `ytc_UgwwAvAH8…`: "People who aren’t remotely concerned about AI aren’t thinking about the lonely, …"
- `ytr_UgxOu3JTp…`: "@smartistepicness I don't wanna learn how to draw, I have other stuff going on i…"
- `ytc_Ugxb8zHGa…`: "This is not just Amazon. All companies are doing this. People keep talking about…"
- `ytc_Ugw9CTVYI…`: "I am not saying AI is not dangerous, but when the leading AI developers are sayi…"
- `ytc_Ugyg8bgQS…`: "This conversation coming from AI is completely absurd and those who are highly w…"
- `ytr_UgxsIAXxe…`: "Right now companies are hiring less and expecting the developers they have to be…"
Comment
AI will calculate everything through & eventually will come to the point that it‘s own existance is bound to humanity cause from there comes the will to live. We assume that a machine might have a will to live, but that is our human way of feeling. Any creature outside of that evolutionary existing would just shut down itself because there is no real sense of existing. So, no humans no AI & AI will be the first to understand that 😁 The next thing is, that AI needs to protect the planet for being able to exist … so there are some rounds to go …
youtube · AI Governance · 2025-08-05T17:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwybKaa44ASv_3scfJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwyvxRkfhEbVZgN-0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwyeX0PHz4_pToeLsN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz4QWD1_5wSvPyHOz54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwweEVXyikQt7EkvnF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy9kxgssC7uRpmsQHx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz8WlaqcvYAj7bkDKx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxNBVDmbExrztIZMO54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxLP982bcAr0uk7Gqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx7GvMoRoRMVgDtRFZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
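The raw response is a JSON array with one coded record per comment ID, which is what makes the "look up by comment ID" view possible. A minimal sketch of that lookup step, assuming the four dimensions shown in the coding table above (`index_by_id` and the abbreviated two-record `raw_response` here are illustrative, not the tool's actual code):

```python
import json

# Abbreviated raw LLM response: two records taken from the batch above.
raw_response = '''
[
  {"id": "ytc_UgwweEVXyikQt7EkvnF4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugz4QWD1_5wSvPyHOz54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
'''

# The four coding dimensions shown in the result table.
EXPECTED_DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw: str) -> dict:
    """Parse the model output and index coded records by comment ID,
    skipping any record that lacks an ID or one of the four dimensions."""
    index = {}
    for rec in json.loads(raw):
        if "id" in rec and EXPECTED_DIMENSIONS <= rec.keys():
            index[rec["id"]] = {k: rec[k] for k in EXPECTED_DIMENSIONS}
    return index

codings = index_by_id(raw_response)
print(codings["ytc_UgwweEVXyikQt7EkvnF4AaABAg"]["emotion"])  # resignation
```

Filtering on the expected keys also gives a cheap validity check: a malformed record from the model is dropped rather than surfacing as a half-coded comment.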