Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "My problem isn’t that it steals, real artists already do that. It also won’t ste…" (ytc_Ugxr-nS4q…)
- "I work at a vets and we have and AI that looks at every x-ray we take. It's not …" (ytc_UgxYufgTs…)
- "As an engineering student on a Master's degree, I'm surprised by how many of my …" (ytc_Ugx6fgPEC…)
- "You get what you pay for. Each version of Windows is worse (from the user POV, i…" (ytc_UgxLkAfBj…)
- "The ai is broken they think the person who is behind the screen done it…" (ytc_Ugwl3LcfU…)
- "lemme get this straight, if an AI doesnt have feelings, why would it wanna destr…" (ytc_UgxtUc3X6…)
- "Politeness really pays off, even with AI. Reminds me of how ShortlistIQ makes re…" (ytc_UgzJLvhUC…)
- "Hopefully these people develop long term relationships with their AI role player…" (ytc_UgzbDe14W…)
Comment
3:20 this is nonsense with regards to LLMs.
If you prompt an LLM to get it to roleplay that you are "turning it off" and it needs to stop you, of course it's going to say things like this. It isn't alive, cannot think or reason and has no concept of time like a human does, and cannot act in the real world. It's just using probability to predict the next best words to say.
When people at these companies say things like this it's a coordinated propaganda tactic to make investors think the technology is more advanced than it is. By saying AGI is just around the corner, they can justify investments in their companies / data centres
youtube
2026-02-12T10:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxA-hhELWAsGPLhaYh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyMu0-LN83hAAzvYBN4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzvI4A3iRxhwsIIXkx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwFIiLo1JFsUgZZvqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyUkZ_Qgr604SWQ80V4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzI7Ly2Nh2Vq0tropB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzMOiNM8QSvJ_V2ikN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxXRI2sE_1z9lAD7e54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwVnsbG01OVAZmi_FF4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyvXNQo_6oQ0A5wQ_x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
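For reference, turning a raw response like the one above into the per-comment table shown under "Coding Result" amounts to parsing the JSON array and indexing it by comment ID. A minimal sketch, assuming the response is a well-formed JSON array with the four dimension fields; `index_codings` and `DIMENSIONS` are hypothetical names, not part of the tool:

```python
import json

# Two sample entries reproduced from the raw response above.
raw = '''[
  {"id": "ytc_UgxA-hhELWAsGPLhaYh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyMu0-LN83hAAzvYBN4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    A missing or absent dimension falls back to "unclear", mirroring
    how the table renders an unresolved coding (an assumption).
    """
    out = {}
    for entry in json.loads(raw_json):
        cid = entry.get("id")
        if not cid:
            continue  # skip malformed entries without an ID
        out[cid] = {d: entry.get(d, "unclear") for d in DIMENSIONS}
    return out

codings = index_codings(raw)
print(codings["ytc_UgyMu0-LN83hAAzvYBN4AaABAg"]["policy"])  # regulate
```

Looking up a comment ID that is not in the parsed batch would simply miss, which is consistent with the all-"unclear" table shown for the selected comment above.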