Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgzCorWDq…: "I'm not against AI, esp after seeing how useful it is for science things, but I'…"
- ytc_Ugw7bGxlW…: "who is going to afford all these trucked goods if everyone's job has been automa…"
- ytc_UgyVfTjl-…: "Change the language. AI had already arrived long before this. POV: on TV, on the Discovery Channel…"
- ytc_UgxIj3RYu…: "Imagine if AI takes over stock holders their jobs… that would put them in place.…"
- ytc_UgxnnqzNd…: "Wouldn't just be better for the AI to filter those people out and tell them Stop…"
- ytc_UgzM2FPyC…: "I think there's effectively zero chance of making AI in such a way that we can u…"
- rdc_m95p43o: "Americans aren;t beating the allegations buddy. Yall think anyone speaking good …"
- ytc_UgyEkmUZr…: "LLM are not AI / It's a stats machine / Developed to replace junior computer progra…"
Comment
If an AI does not give you a correct answer because it fears you are pulling the plug to kill it then, than we could assume it is getting conscious. With our current technology this will never happen, except some dude programs the AI the way it fakes those emotions of not wanting to get killed. Currently AI tries to fake human consciousness and logic and the results of it's behavior might be very very similar to Evan one of the most intelligent humans but... something is lacking a procedural machine that is using binary code as a fundament of everything. I wonder whether we will ever find out (before AI is killing every one of us :-)
Source: youtube | AI Moral Status | 2025-04-22T11:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugx3dhowfFA7cQGATLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwmTEDtKilWLfy4HER4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyrDThPkAUkzmsXzOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxk3y_AtBo5FP9if394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwancDhtQVzUcMxRn14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyB_qt-0HA0JP9CIQF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx9DGAdjPIkzdZn2rR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzc5PTXq-rYS1htN8Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwCZhhVvSGoJHWijsB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgyRNGnoI1NiQgIdUnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
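
The raw response above is a JSON array, one object per coded comment. A minimal sketch of how such a batch could be parsed and validated before the codes are stored (the `ALLOWED` label sets are inferred only from the values visible in this response, not the project's full codebook, and `parse_batch` is an illustrative helper, not the actual pipeline):

```python
import json

# Allowed labels per dimension. These sets are an assumption inferred from
# the labels visible in the sample response above, not an exhaustive codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM batch response into {comment_id: codes},
    dropping rows with a missing id or an out-of-vocabulary label."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim) for dim in ALLOWED}
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"none","reasoning":"unclear",'
       '"policy":"none","emotion":"not_a_label"}]')
print(parse_batch(raw))  # keeps ytc_x, drops ytc_y (invalid emotion label)
```

Validating against a fixed vocabulary catches the most common failure mode of batch coding, the model emitting a label outside the schema, before it silently lands in the table above.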