Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz6oGnR0…: "They got a $5K slap on the wrist. They should of been disbarred for doubling dow…"
- ytc_UgwcDWJ5_…: "I would be happy for AI to do my job so long as I still got paid. If AI takes jo…"
- ytc_Ugz_NSWmE…: "You are being ableist to the AI. It would be like saying a quadriplegic decided …"
- ytc_UgzGpsMnc…: "so it all makes sense now , they want to carbon tax humans to offset the cost…"
- ytc_UgyLoCqs6…: "This is why they don't want us having automatic gun but if you know you know. Di…"
- ytc_UgypkCOxV…: "At the end of the day it comes down to other countries like china getting to sup…"
- ytc_Ugw5TCS-l…: "I've stated from the beginning that AI will be as wicked and decietful as the hu…"
- ytr_UgzJ-6De1…: "@SnowyKoneko 15 minutes everyday and get better? And when i get good? In 15 year…"
Comment
2:12 “...won't that confuse people about what the truth is?...”
What about the 2020 presidential election and January 6th? Seem pretty obvious that wasn't AI work!
The issue here is the same as mass shootings. How to control who has access to which AI technology and for which usage. Or maybe, should we put a good guy with AI against the bad guy with AI?
For now, AI alone doesn't do anything! He desperately needs a human to give him a prompt!
Source: youtube, 2023-05-08T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgweJFIux0o0-IMhJ9h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxvS-3ghvGCDFYOZod4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwTYOR98ot11MhtGIV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy04J7PUpcnCna9BC14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwiSXXwOa30ahcSMNp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgygCM6MNfCtQlDXrd94AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugwi8KGmECvGXry4VAt4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_Ugxw1s9lnHPxTjmPJFN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzebUsdzliHCsY2Zo14AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgwlJGyoROrrVyhvrg14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
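A response like the one above has to be parsed and sanity-checked before the codes are stored, since an LLM can emit malformed records or values outside the codebook. The sketch below is one way to do that, assuming the four dimensions shown in the table; the allowed-value sets are only the values observed in this sample, and the real codebook may define more.

```python
import json

# Values observed in this sample response; the full codebook may allow more
# (this mapping is an illustrative assumption, not the project's schema).
OBSERVED_VALUES = {
    "responsibility": {"none", "government", "company", "user", "distributed"},
    "reasoning": {"unclear", "consequentialist", "virtue", "deontological"},
    "policy": {"none", "regulate", "liability", "industry_self", "ban"},
    "emotion": {"outrage", "fear", "indifference", "mixed", "resignation"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record needs a comment id plus one allowed value per dimension.
        if not isinstance(rec, dict) or not rec.get("id"):
            continue
        if all(rec.get(dim) in vals for dim, vals in OBSERVED_VALUES.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_x","responsibility":"user","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"fear"}]')
print(len(parse_codings(raw)))  # 1
```

Records with an unrecognized value are dropped rather than coerced, so a drift in the model's output vocabulary surfaces as missing rows instead of silently corrupted codes.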