Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "To be fair, society has been suffering from a lot of delusional psychoses for a …" (ytc_UgyKEwNo8…)
- "Huh. And here I thought Robotaxis would be gold-colored two seaters, like in the…" (ytc_Ugxf6sXNu…)
- "@aliceDarts anything that makes you feel something is art. Current ai is just co…" (ytr_UgxgRCKK-…)
- "Proof that AI "art" can be used as inspiration, absolutely not as the final prod…" (ytc_Ugyfs-6Ms…)
- "The only way manufacturing "comes back" is if the automation beats the economics…" (ytc_UgyrvEMB7…)
- "Imagine spending anywhere from 1-8 years to get denied your diploma because your…" (ytc_Ugx5ZmJ3L…)
- "When the job losses escalate out of control, things will balance out and settle …" (ytc_UgyTtAoUC…)
- "@thesong7877 I'd rather see a world where AI takes over menial work FIRST, not m…" (ytr_Ugw_uYiW1…)
Comment
We are at least 100 years away from artificial intelligence. What we have now are much faster computers that still depend on the people who program them. Artificial intelligence will include part of our physical brain and a superfast computer combined. What we have is a joke, very far from artificial intelligence.
youtube
AI Governance
2025-07-14T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyPG2vDXaihuJ7VefZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugws2l-WckR1OPMB22Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYSiTuQhJNRN5sZnV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwflwvMMqrQUNdYcc94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0YaZRxXrg3gy9-dF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzU8DqHK7hBVDhHrNl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwy8CK-hJ4YFUP3wsN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxDfWvyQXJs3-4Dgnh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz3ERBoMO7PKDOhPX94AaABAg","responsibility":"government","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz5uVbhM38PrICYre14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
```
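A raw response like the one above (a JSON array of per-comment codes) has to be parsed and validated before the dimensions can be stored. The sketch below shows one minimal way to do that, assuming the value sets are exactly those observed in this sample; the actual codebook for this tool may define additional values, and the function name is hypothetical.

```python
import json

# Dimension values observed in the sample response above.
# Assumption: the real codebook may allow more values than these.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects) into a
    mapping of comment ID -> coded dimensions, rejecting malformed rows."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing comment id: {row!r}")
        dims = {k: v for k, v in row.items() if k != "id"}
        for dim, value in dims.items():
            if dim not in ALLOWED:
                raise ValueError(f"{cid}: unknown dimension {dim!r}")
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
        coded[cid] = dims
    return coded
```

Keying the result by comment ID makes it easy to join the codes back onto the comment table shown in the "Coding Result" section.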