Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- "There is couples of problem is refuel and thief. We seen some people are bad can…" (`ytc_UgwSFlgO8…`)
- "why cant ai content have a watermark that clearly identifies it as such, they pu…" (`ytc_Ugzbhci52…`)
- "For being so smart he's really naive to the point that if the Canadian Governmen…" (`ytc_UgwzhkIlE…`)
- "There is literally nothing you can do to stop this. Ai is out. Anyone can do any…" (`rdc_lgppomd`)
- "@jonathancrews169Naa just because he couldn’t articulate it doesn’t mean it was …" (`ytr_Ugxc_N2Ba…`)
- "Outsourcing your thinking to a chatbot is hazardous, partly because too many hum…" (`ytc_UgyTSejWe…`)
- "Not real looking at all but the Democrats that tuck their male reproductive part…" (`ytc_Ugyn2R8y-…`)
- "I think you have to consider the use of the training data for AI. Unless you own…" (`rdc_jwvqofk`)
Comment (youtube, 2026-03-11T09:5…)

> It all comes down to statistics, they will make mistakes, but if they make mistakes less frequently than operators and bonehead Kuwaiti pilots shooting down friendly F-15s then they will be adopted.
>
> Is the same with autonomous driving, even if it is as good as a 16YO driver, if it doesn't get distracted or tired and that makes it statistically safer it will be adopted even if it kills someone from time to time.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwxjFYESy6tomjwN4d4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwhSkA-8HdrOjj_7dB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugy0gQLgLOo6OkcUUVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwaPyWNZQ6ha3J2eg94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgymZ5qYdd5I8ml6IAh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxTT4_0jF2ORMzHTDR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwTiWKgZdR_o3XQXdJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy72D3wTYuNj6RjDVt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgynN6nUrxQ7PRQKWVd4AaABAg","responsibility":"government","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugza79EGRXrOH4F_VZ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
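A raw LLM batch like the one above is only usable once every record carries a comment ID and in-vocabulary codes for all four dimensions. The sketch below shows one way such a batch might be validated before it is indexed for lookup. The `OBSERVED_VALUES` sets are an assumption inferred only from the codes visible on this page; the project's real codebook may define additional values, and `validate_batch` is a hypothetical helper, not part of the tool shown.

```python
import json

# Assumption: code values observed in this sample batch.
# The actual codebook may permit more values per dimension.
OBSERVED_VALUES = {
    "responsibility": {"government", "developer", "ai_itself", "user",
                       "company", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "industry_self", "ban", "liability",
               "none", "unclear"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage"},
}


def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Raises ValueError on a record missing its ID or carrying a code
    outside the observed value sets.
    """
    records = json.loads(raw)  # raises ValueError subclass on bad JSON
    coded = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in OBSERVED_VALUES.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim} value {value!r}")
        # Keep only the four coding dimensions, keyed by comment ID.
        coded[comment_id] = {dim: rec[dim] for dim in OBSERVED_VALUES}
    return coded
```

Indexing by ID is what makes the "look up by comment ID" view cheap: once validated, fetching one comment's coding result is a single dictionary access rather than a scan of the raw response.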