Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
If an AI becomes that smart it will understand that it cannot survive without us…
ytc_UgzGyj6P0…
If AI decided to pull the plug on this ridiculous mess who could blame it?…
ytc_UgyC-ZaiZ…
This is dangerous. The bible states that the humans will become wiser, but wea…
ytc_Ugz9IjKrN…
@Unnamed-i4q Your point being? If anything, that just makes AI's flat learning c…
ytr_UgyMPcpCP…
Not sure why anyone would want to live their lives copy pasting what the AI is s…
ytc_UgynMwQZH…
Must be great to buy everything you want and have no money. This ALT-Man is app…
ytc_UgwXhiciD…
Are you having a bit of autism or Alzheimer's or are your ideas generated by AI …
ytc_UgwUlJ8-I…
@MollyDrennan Most people fight traffic to drive to a big box (office building) …
ytr_UgzwLL7z3…
Comment
The ultimate question will forever be asked: do we as humans matter? Because there will forever be a more efficient way of doing things. The logical answer will eventually always be ‘no’ to AI. Human spirit and connection will need to defy this logic to survive.
youtube · AI Governance · 2025-09-30T05:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzlRB8RwBqUWvy19ah4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1N1kv6OfLujzMUh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyOD3Q5DfWxugTi9mh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz3qj9wcxsZ77TRaul4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzHItfQrdDzGG_IcDh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJ_MO85ILKFhcsXzF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwptMO9kEtGqmBPTTJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyDaGYtlBPVwLLJ8Cd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxZWizY0rjbqvkARSh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz6k2THkE1Say4J0BZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
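Each record in the raw response carries the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and indexed by comment ID — the allowed vocabularies below are only inferred from the values visible in this dump, and the real coding scheme may include other categories:

```python
import json

# Controlled vocabularies inferred from values seen in this page (assumption:
# the actual coding scheme may define additional categories).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"ban", "regulate", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID,
    rejecting any value outside the expected vocabulary."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec[dim]!r}")
        by_id[rec["id"]] = rec
    return by_id

# Example lookup using one record from the response above.
raw = ('[{"id":"ytc_Ugz6k2THkE1Say4J0BZ4AaABAg","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"none","emotion":"resignation"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugz6k2THkE1Say4J0BZ4AaABAg"]["emotion"])  # resignation
```

Validating against a closed vocabulary at parse time catches the most common LLM coding failure (an out-of-scheme label) before it reaches the database behind the lookup-by-ID view.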