Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "Not going to lie, as a network admin I deal with ATT a lot and every time I have…" (ytc_UgzNq09-0…)
- "They should make it so one of the cars sends a signal to the other that it has p…" (ytc_UgxkwgWRS…)
- "Why does the male robot sound like Simon Cowell? Sorry but they need a better ho…" (ytc_UgxnyQweB…)
- "What it boils down to is that computers already calculate all delivery routes (a…" (ytc_UggxyDfHD…)
- "Software engineer here, you're right. I spend 90% of my time reviewing and rejec…" (ytr_UgxAwyQxK…)
- "It's pretty simple. AI steals art and uses it as a derivative, not a reference. …" (ytc_UgxDrnt8H…)
- "Every tech can be used in a good and bad way. I am pro AI and advancement all th…" (ytc_UgwJP_mXW…)
- "\"Why would it try to break out and commit murder to achieve its goals?!\" Probab…" (ytc_UgxoiC0xq…)
Comment
This would be true if genuine judgement was somehow developed. I use Claude and GPT-5 daily and the level of confidently supplied error is staggering. Only uninformed or uneducated people could think AI is even close to handling the simplest of jobs where an unforeseen circumstance could arise. The AI will always act, but as yet it has no way of realising it has done wrong. The degree of error will reduce, but it won't go away, and the possibility of recovery is pretty random. I will only really trust driverless cars when all the cars are driverless. Same for any other activity.
youtube
Viral AI Reaction
2025-12-10T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
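The dimensions in the table map one-to-one onto keys in the raw JSON records below. A minimal validation sketch, assuming the value sets observed in this dump are the full code book (the real scheme may include codes not seen here):

```python
# Allowed values inferred from this dump; the real code book may be larger.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"none"},
    "emotion": {"outrage", "fear", "approval", "indifference"},
}

def validate(record):
    """Return (key, value) pairs in `record` that fall outside the schema."""
    return [(k, record.get(k)) for k in SCHEMA if record.get(k) not in SCHEMA[k]]

# The coded comment above, expressed as a record.
coded = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "none",
    "emotion": "outrage",
}
assert validate(coded) == []  # all four dimensions carry known codes
```

A record with an unrecognised code (say `"responsibility": "user"`) would come back as `[("responsibility", "user")]`, flagging it for manual review.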
Raw LLM Response
[
{"id":"ytc_UgzboU8GVnUCF8CrWxR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzcpzxz3Ce30wjWQk94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxqaJ0elmkdgBxzIbl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwqsau_2nK-Qn76PE54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyAQ4ibW4HL-LNS8MV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7f5Hpx3MIfLopc8h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy4BbyhqXkp8Bxv9wl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztZjBfID3eo8rDQrJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzZoj7k73vpzCFxdk94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgybSW8USaEgT_2JXbF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
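The raw response is a JSON array of per-comment codes, so the "look up by comment ID" flow described at the top reduces to parsing the array and indexing it by `id`. A minimal sketch (the two records are copied verbatim from the response above; the helper function is my own, not part of the tool):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id":"ytc_Ugz7f5Hpx3MIfLopc8h4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgybSW8USaEgT_2JXbF4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
"""

def index_by_id(response_text):
    """Parse a raw LLM response and index its coded records by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_Ugz7f5Hpx3MIfLopc8h4AaABAg"]["emotion"])  # prints: outrage
```

In practice a response that fails `json.loads` (truncated output, stray prose around the array) would need to be caught and re-requested before indexing.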