Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I dont like open AI, it always pop up , when u search its there copying the work…" (ytc_UgyXIeP31…)
- "There is a chance that Sam Altman of OpenAI as well as whoever is behind Anthrop…" (ytr_Ugw1xsTpr…)
- "It's not a problem with AI or its data sets. AI is mirror reflecting the reality…" (ytc_UgwYWniGd…)
- "everyone who read sci-fy storys involving AI knows the danger. Heinlein, Assimo…" (ytc_UgwJRUevu…)
- "I’m a 15 year old and I’m used to the looks of ai. Even I think I would be foole…" (ytc_Ugzk9vvDl…)
- "Make a video in which you pretend to be an advanced robot that is self conscious…" (ytc_UgyfJIuIw…)
- "I'm sure it can not post out more crap than the BBC, even with Ai.…" (ytc_UgxT0ps-P…)
- "I am imagining a robot Karen that screams: "01101001 00100000 01110111 01000001 …" (ytc_Ugy3ALZ5w…)
Comment
> Isn't it common consensus amongst scientists that the feelings that make us humans *alive* are caused by chemicals in the brain? How would the AI even *be* self aware without a complex proccesing device like our brain?
> Surely we'd give an AI a brain to actually be able to be alive if we wanted one to be so; and then that solves the question.

- Platform: youtube
- Video: AI Moral Status
- Posted: 2023-08-21T06:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgySZ6aLxO7ZpreByjx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhpURSR2IJEDSpv494AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyWgr1V5d5bs3tppft4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwhkKosGzX3vt7JSYR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx6l9iUT3XAEriWuNF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzK6rJckH_Tb0w0wqJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzQ2GXzis34278cFMZ4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzPTds8zGVirYTk1hx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxotep83lhNTvUs1cF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKJBMcpWtO68-y1qV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
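The "look up by comment ID" step described at the top of this section can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: `raw_response` and `index_by_comment_id` are hypothetical names, and the two example rows are copied from the raw LLM response shown above.

```python
import json

# Hypothetical raw model output: a JSON array of per-comment codings,
# in the same shape as the "Raw LLM Response" above (two rows copied from it).
raw_response = """
[
  {"id": "ytc_UgySZ6aLxO7ZpreByjx4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzQ2GXzis34278cFMZ4AaABAg", "responsibility": "unclear",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"}
]
"""

def index_by_comment_id(response_text):
    """Parse a raw LLM response and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
print(codings["ytc_UgzQ2GXzis34278cFMZ4AaABAg"]["reasoning"])  # -> deontological
```

In practice the batched responses would be parsed once and the resulting index reused for each inspection click.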