Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
🔥 *WTF! of course not! ground yourself.* A.I. approximates human behavior in its attempt at being unrecognizable from human interaction. It CANNOT however be sentient, its wires and programming. It will ONLY ever be a marionette. Might as well give my toaster human rights.... anyone pushing for this means they are working for a company that doesnt want to be *liable some day* . Also you are assuming that A.I can achieve sentience, where is the evidence for this?. Have none, yeah thought so... non issue
| Platform | Video | Posted | Likes |
|---|---|---|---|
| youtube | AI Moral Status | 2020-07-08T10:3… | ♥ 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy-JKQrSLL2ffZdJPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJEDKgvEoZ_eIIZrR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyOgNugBZppAajoy9J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyXbQDK2kQBkxVv5NR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxz1sAmqzZ3G8Kxc6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyRIfj07c5wpIjTEYF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzvQr7v9KVXOCYA5XN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFdQVCpNH36Irz_7d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwrBsqI-qr_8BQ7EH14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyCiC-we3JQ8eAk7HR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
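The per-comment inspection above can be sketched in a few lines of Python: parse a raw LLM response (a JSON array of coded comments) and index it by comment ID so any one coding can be looked up. This is a minimal illustrative sketch, not the tool's actual implementation; the `index_by_id` function name is hypothetical, and the sample response is an abbreviated copy of the batch shown above, with the same four dimensions (responsibility, reasoning, policy, emotion).

```python
import json

# Abbreviated raw LLM response, copied from the batch above.
raw_response = """
[
 {"id": "ytc_UgwrBsqI-qr_8BQ7EH14AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
 {"id": "ytc_UgyCiC-we3JQ8eAk7HR4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse a raw LLM response and map each comment ID to its coded dimensions."""
    rows = json.loads(response_text)
    return {row["id"]: {k: v for k, v in row.items() if k != "id"} for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgyCiC-we3JQ8eAk7HR4AaABAg"]["emotion"])  # outrage
```

A lookup like `codes[comment_id]` then returns exactly the dimension/value pairs shown in the coding table, which makes spot-checking a model output against the stored coding straightforward.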