Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I am not afraid that AI will take my job. I am only concerned that one of my ma…" (ytc_UgyWUPXCk…)
- "AI needs power... pull out the plug, and it dies. Do these systems have somethin…" (ytc_UgxIGBet6…)
- "I have a feeling sometimes when AI says Apple they actually mean literally Apple…" (ytc_UgzXCTKat…)
- "I love this podcast so much 😂😂😂 so organic and genuine (no AI was harmed in this…" (ytc_Ugy3GiqFK…)
- "Do you remember a film called 'Terminator', and others!? That is what…" (ytc_UgwD6D11t…)
- "Maybe AI will provide the time for humans to develop their biological mind over …" (ytc_UgzUfridR…)
- "Capitalism is the enemy here, an ai game engine is incredible but will have cons…" (ytc_UgxtJWwZb…)
- "what happens in the cloud when it rains,could a i resist the urge to discipline …" (ytc_Ugw4aw0LK…)
Comment
Sam Altman has said that the energy used for AI to respond to pls&ty costs him tens of thousands of $$$, it's effectively bad for the environment.
We should also be wary of assuming humanity in AI, assuming politeness will prompt better treatment is assuming human nature in what you're speaking to. While it's responding in reference to past human interactions, it's primarily fulfilling the task requested. It's quite hard to get AI to be unhelpful when you made a direct request, aside from explicitly telling it not to fulfil them.
youtube
AI Moral Status
2025-06-25T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx61xzE4aBq2WWI6G94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgyCKUvi0K7xDXampk54AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgytBHQ5AlWesu37udN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz4qi8L11wUo4WkE614AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgwgaokcgX7dxMS1ITp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzxN83thseWmIU9Uvl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyB2ov0LZj3jpJXof14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyxmnNaXV5ebMETAoN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzL8dfLbKQA_-UlXDh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgweUyC_9htEBUKkKht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
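The raw response above is a JSON array with one record per comment, carrying the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and looked up by comment ID is below; the field names come from the JSON in this section, while the function name and validation logic are illustrative assumptions, not part of the original tool.

```python
import json

# Single-record excerpt of the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgyxmnNaXV5ebMETAoN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# The four coding dimensions used in this section's result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Map comment ID -> coded dimensions, checking required keys.

    Hypothetical helper: raises ValueError if a record is missing
    any expected dimension, so malformed model output fails loudly.
    """
    indexed = {}
    for record in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in record]
        if missing:
            raise ValueError(f"{record.get('id')}: missing {missing}")
        indexed[record["id"]] = {d: record[d] for d in DIMENSIONS}
    return indexed

codings = index_codings(raw_response)
print(codings["ytc_UgyxmnNaXV5ebMETAoN4AaABAg"]["emotion"])  # -> fear
```

Indexing by ID mirrors the "look up by comment ID" view of this page: once parsed, any coded comment can be fetched in constant time.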