Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its comment ID.
Random samples:
- "Crazy how humans will just destroy themselves. Like we won't die out for example…" (ytr_UgzvQsWO3…)
- "Corporate slop word salad, how is “adopting” to AI gonna put food on the table o…" (ytr_Ugy1-6xUK…)
- "Don’t replace government officials with AI, use is as an advisor yes but technol…" (rdc_ktuakzx)
- "@jelly291 > AI on the other hand, takes parts of dozens of images and puts them …" (ytr_UgzVRxGEw…)
- "I think he is sugar coating it. There will be no jobs at all by 2030. AI and rob…" (ytc_Ugzp4IT1n…)
- "@mfitzgerald130 We can't know what form this technology will take. What if we g…" (ytr_UgxNW9le_…)
- "artifical intelligence should be banned for regular people and should be used fo…" (ytc_UgzfRwfvM…)
- "Trust me, I know for a Fact ( don't ask me how, because I cannot reveal how I k…" (ytc_UgwPNKe3s…)
Comment
> So if we make an AI robot to determine what is litter and what isn't and picks up the litter...you are saying that it can't be in the shape of a human? How does its "form" as a human make it especially dangerous?

Source: youtube | AI Governance | 2023-05-16T20:0… | ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzVfiIcMF1l_Je0u6d4AaABAg.9pn0Yb_I47C9ppacM3mqNx","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxHCOVLfCDQFcNmSel4AaABAg.9pn-XVA1Hbg9pn9vFagUKZ","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytr_UgxHCOVLfCDQFcNmSel4AaABAg.9pn-XVA1Hbg9qgzFZv5FeR","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgwoySkl4MRsQkmBjSh4AaABAg.9pmxWp_fJv79pn3Z5njUd4","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwoySkl4MRsQkmBjSh4AaABAg.9pmxWp_fJv79pnQEnO1f6a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwoySkl4MRsQkmBjSh4AaABAg.9pmxWp_fJv79pyp3aHSeFa","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyfaLTHbc73yxfTcJB4AaABAg.9pmo6znuYqo9pnT6cY66hz","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgyfaLTHbc73yxfTcJB4AaABAg.9pmo6znuYqo9pokP2m-LWX","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_Ugw24WZzQxwp8FGivEp4AaABAg.9pmku29CVQv9pmmYjVOUJl","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw24WZzQxwp8FGivEp4AaABAg.9pmku29CVQv9pmpoj3eEWQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
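The raw response above is a JSON array of per-comment coding records. A minimal sketch, assuming this shape, of how such a batch might be parsed and validated before the dimensions are written to a coded table; the allowed value sets below are inferred from this single sample and are an assumption, not the project's full codebook:

```python
import json

# Allowed values per coding dimension, inferred only from the sample
# records shown above; the real codebook may permit more categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "virtue", "deontological"},
    "policy": {"none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "approval", "outrage", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse one raw LLM response and index its records by comment ID.

    Raises ValueError on malformed JSON, a missing 'id', or an
    out-of-vocabulary value, so a bad batch fails loudly instead of
    silently polluting the coded dataset.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coding records")
    by_id = {}
    for rec in records:
        comment_id = rec.get("id")
        if not comment_id:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim} value {rec.get(dim)!r}")
        by_id[comment_id] = rec
    return by_id

# Example: a one-record batch in the same shape as the response above
# (the ID here is a hypothetical placeholder).
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
coded = parse_coding_response(raw)
print(coded["ytr_example"]["emotion"])  # → indifference
```

Indexing by ID is what makes the "look up by comment ID" view cheap: rendering the Coding Result table for any comment is then a single dictionary lookup.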