Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- "Artificial Intelligence is a contradiction of terms. Why would anyone wish to h…" (`ytc_Ugwn1VYzk…`)
- "The elites are going to kill 80% of the world population … and blame it on ai ro…" (`ytc_UgzSFQAcp…`)
- "No, there is no need to embrace it. Just because it's good at stealing things an…" (`ytr_UgxBHFNDL…`)
- "This is an awesome idea. Smart enough to do the work, smart enough to pay taxes.…" (`rdc_oh2iobh`)
- "I wonder if facial recognition technology can be used to fight against the deepf…" (`ytc_UgxNQuLKq…`)
- "100% does it make my life better and automate out some of the boring task? Yes. …" (`rdc_mxyhksf`)
- "We as a society don't owe anybody driving jobs. If you're working in an obsolete…" (`ytc_UgiBF9knl…`)
- "This is exactly what someone who wants more people to invest in ai would say…" (`ytc_UgzDVP2Uj…`)
Comment

> Government oversite won't matter, computational power technology is currently bottlenecked for AI. But that won't always be the case because of technological exponential growth. In 25 years Joe Schmoe could have a self built quantum computer in is home developing artificial general intelligence.

Source: youtube · Topic: AI Governance · Posted: 2023-11-23T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyyXjTzLPsiVvs1f-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwdrxT1yIN3xcrj4q94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxd1vvjnuueT7e6dPt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzWWlzCnaE1ul-0zaZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxk8hyXb-KQj8RqVJl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxazyOYSiLsGQM2x9t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxRRuR2zqqe1_Om22x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz93K-qM5KUdXOFvtx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"unclear"},
{"id":"ytc_UgwHb1tUzIvVBygiQRd4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgySoa9MwZHrkHkuDaN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
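A raw response like the one above is only usable if every row parses and every dimension takes an expected value. Below is a minimal validation sketch in Python; the allowed values per dimension are an assumption inferred from the examples on this page, not from a documented codebook, and `validate_batch` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension, inferred from the examples on this
# page (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "mixed", "approval", "outrage", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must carry the comment ID it was coded against.
        if "id" not in row:
            continue
        # Every dimension must be present and hold a known value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}]'
print(len(validate_batch(raw)))  # 1
```

Rows that fail validation are dropped rather than repaired, so a count mismatch between input comments and validated rows flags a batch for re-coding.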