Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Lol, here is a joke: "Librul makes a librul AI. Gets vaporized immediately by hi…" (`ytc_Ugz8AdaQR…`)
- "Only people who know how AI really generates content would be able to effectivel…" (`ytr_UgzrlPvaU…`)
- "Yes these autonomous trucks are finally on the road, albeit not in record number…" (`ytc_Ugz25asdl…`)
- "I think you missed the mark just a little with that comment… well just the “what…" (`ytc_UgxpW15-I…`)
- "Damn you have a point , I wish you didn't have such a good one. Go ahead and sue…" (`ytc_Ugy6it2nk…`)
- "ai "art" and actual art is the difference between "I bought this fully assembled…" (`ytc_UgwJCLoKN…`)
- "I am more scared of humans than AI. Their selfishness callousness greed. I am to…" (`ytc_Ugwa2OhJa…`)
- "You can't beat a programed robot to a human who has to think it's next move .Ro…" (`ytc_UgwaBiI_x…`)
Comment (youtube, 2025-07-19T07:0…)

> Forming an opinion on the safety of ai based on what Altman has to say ,is like going to the parts store and buying parts based on what the seller tells you is wrong with the car
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_Ugw3WW3PP8xPSr5SiDx4AaABAg.A8jhkr2fthcA8lTdDRhqtL","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytr_UgytN906X8T44bn9v7h4AaABAg.A8jcFERQv-xA8sBRZsuITC","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz1iTIrzEX-EsV-BeR4AaABAg.A8jbmYRCN5PADdmLPqG-Gf","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxF1nT8Ffcsnn-rSax4AaABAg.A8jbcFV5jOrA8zxXr6BKbE","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgznHJWBjJBHTHM2UMB4AaABAg.AUWEz74X8Q0AUWFfopeUkd","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgyDrd9PTa40C_tmPrd4AaABAg.AMSDJaF5qzmANaJgPuqQ5k","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyDrd9PTa40C_tmPrd4AaABAg.AMSDJaF5qzmAOfrBS5qOCM","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxwoO3VRBQ_le-G8nt4AaABAg.AKXldDwl9CEAKkRhTEph87","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgxlYhjDORVsMmo11E14AaABAg.AJblj138e9YAJgred4r7yc","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytr_UgztqQ0hd67tFRgAgCJ4AaABAg.AJITXl1Ij_9AJJUpeDdDZg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
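A minimal sketch of how a raw response like the one above might be parsed and sanity-checked before the per-dimension values are stored. The allowed category sets here are inferred only from the values visible in this sample; the full codebook may define additional categories (assumption):

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above — the actual codebook may permit more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "company", "government", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "approval"},
}

def validate_response(raw: str) -> list:
    """Parse a raw LLM response and check every coded record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Example with a hypothetical comment ID:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"liability","emotion":"fear"}]')
records = validate_response(raw)
print(records[0]["responsibility"])  # → company
```

Failing fast on an unexpected category catches the common failure mode where the model invents a label outside the codebook, so bad records never reach the results table.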