Raw LLM Responses
Inspect the exact model output for any coded comment. Responses can be looked up by comment ID.
Random samples

- I can see the over reaction of people waiting for Tesla cars to make a mistake. … (ytr_UgxG-yR-5…)
- This is why I only ask AI questions like "Are dogmen real" Or "If all the people… (ytc_UgzmvUYOR…)
- Worst kind of people using Ai for the worst application ever… And they proud of… (ytc_Ugy1xX26A…)
- Also trucking. AI is a part of automations on the tech side of logistics yes, bu… (ytr_Ugx1KtLQQ…)
- The real threat of ai is that it is turning humans into livestock. Leaving them … (ytc_Ugw_Ri-_V…)
- I always write "thank you" and other things like "how are you", "please" "you're… (ytc_Ugwu2-j87…)
- Like nuclear weapons if small countries, grabs ai technology in their hand and u… (ytc_Ugy7oG9lm…)
- My actually decent IQ brain tells me that to lower chloride ions in the body whi… (ytc_UgyfoINCu…)
Comment

> I think we end up somewhere between Terminator and Matrix.
> Ai destroys mankind, and the surviving humans are enslaved to produce energy with their labour, not as batteries but as workers, because nuclear winter does not let the sun hit earth for hundreds of years.

youtube · AI Responsibility · 2025-07-24T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
{"id":"ytc_UgwMgVJfPbJd6y0VNwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgywFGBAFxl-sz1daVJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxDPv6fVRM0GFuG6694AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyMvFTR6vU8LtELKX54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzNY0HpjDQi1zBWEK54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1hwSdIzUx83pbRmJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz3UXwM82FfVoVg_7B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgweFpDXvIm92ttZWTh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJQWR7ZB5VVdpIWH54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugylw9p2RyQc6BL9S7x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
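A raw response like the one above can be parsed and indexed by comment ID to support lookup of individual coding results. The sketch below assumes the response is a JSON array of flat records with an `id` field plus the four dimensions shown in the table; the allowed value sets per dimension are inferred from the samples in this section, not from the actual codebook, so the real category lists may be longer.

```python
import json

# Allowed values per dimension, inferred from the sample records above.
# Assumption: the real codebook may define additional categories.
DIMENSIONS = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval", "mixed", "unclear"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) and
    index each record by its comment ID, validating every dimension."""
    records = json.loads(raw_response)
    indexed = {}
    for rec in records:
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        indexed[rec["id"]] = rec
    return indexed

raw = ('[{"id":"ytc_UgyMvFTR6vU8LtELKX54AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
coded = index_by_id(raw)
print(coded["ytc_UgyMvFTR6vU8LtELKX54AaABAg"]["emotion"])  # fear
```

Validating at parse time catches malformed model output (a misspelled or invented category) before it reaches the inspection UI, which is also where a "Coded at" timestamp would be attached to each record.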