Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
They told the robot that it was going to be working 40 hrs per week and some wee…
ytc_UgyvysPUv…
@rosevee35 Copyright law doesn't protect style or ideas—only specific expression…
ytr_UgzCqX3B8…
This doc is the prequel to George Lucas’s THX-1138. A society where the elites s…
ytc_UgxKWRt6_…
Err... I'm a CG supervisor and can objectively say the 3d ice cream is way riche…
ytc_UgwR2lzqp…
@snake698 Still better than letting people claim that their art came from an ac…
ytr_Ugyg77T6_…
@thewannabecritic7490 probably you won’t get anywhere shaming the customers most …
ytr_Ugwo6jt4A…
Facial recognition has been something that independent "techies" have been talki…
ytc_UgzQ4zdt0…
The fucking robot exits the truck full monotone, complete excitement. "I'm alive…
ytc_Ugzxvrxfb…
Comment
Sometimes people use the idea of superintelligence, like having a perfect oracle that is omniscient. What if superintelligent processing is only a necessary but not sufficient condition for knowledge, and even a superintelligent A.I. can't accurately draw the right conclusions without all relevant premises? Which premises might be prohibitively expensive to discover?
youtube
AI Governance
2024-12-14T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugzyw7P6UIG7qr9orm94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz5qfO2p5ouopqxF9J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw5jx3JN_iJjVdgF-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxgabcdIuRhNkDAGoZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzK0cxdklJv4XjEKQV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugwk38JoiF5nupttEiV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"outrage"},
{"id":"ytc_UgxUpWrqOtfeJUqbHoB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw9Yn37_qtH16HPxL54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzRiCvRXTjY9wSaOpB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrxwC9GQeGPZSOxHV4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
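The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and validated before storage, assuming the allowed category values are the ones visible in this sample (the real codebook may define more) and that `parse_coding_response` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per coding dimension, inferred only from the samples
# shown above -- an assumption, not the full codebook.
DIMENSIONS = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "indifference", "unclear"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (JSON array of per-comment
    codings) into a dict keyed by comment ID, validating each value."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value {row.get(dim)!r} for {dim}")
        coded[cid] = {dim: row[dim] for dim in DIMENSIONS}
    return coded

# Usage with a one-row response shaped like the output above
# (hypothetical comment ID):
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]')
coded = parse_coding_response(raw)
print(coded["ytc_example"]["emotion"])  # mixed
```

Validating against a fixed value set at parse time catches hallucinated or misspelled categories before they reach the results table, rather than surfacing as "unclear" downstream.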