Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Controlled opposition. This guy is very politically driven. Something doesn't fe…" (ytc_UgxWUS1t2…)
- "I dropped out of the artist's guild in my city a couple months ago. Using AI art…" (ytc_UgzMp5Wo4…)
- "It’s wild how Andrew Yang was one of the few who didn’t just talk about automati…" (ytc_UgwjaSn0r…)
- "That art is clearly not ai I know ai hasn’t gotten good but still people need to…" (ytc_UgySkGUej…)
- "im confused he was shot by who its not like the ai did it?? i assume he posted s…" (ytc_Ugw4NyqIa…)
- "Do you remember how back in the day people used to say that digital artists are …" (ytc_UgxqOFwPl…)
- "AI is insidious, people accept it, here's an example, you ring a company to make…" (ytr_Ugwi6_r2k…)
- "This is bad.... cuz when one robot learns to kill the knowledge will enter the c…" (ytc_Ugz4pYxg4…)
Comment
I always said Man will out do himself with technology. AI is and will be just that. AI can’t pay taxes, pay rent, pay mortgage, keep currency circulating to grow the economy. Can’t see AI needing to purchase a car, food, housing, clothes, cable service, cell phones, attend concerts, schools, or anything a hard working American who would be forced to pay taxes would need or want . AI can and will be the ultimate demise of lives if allowed to take the place of mankind. With that said, AI can possibly turn on it’s owner as well. There was a overworked robot that collapsed on the internet. I guess AI has it’s limits as well as humans. Greed has no limit apparently 🤔
youtube · AI Governance · 2023-05-03T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgyHTqB50uYhZUnBGcd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw1DjH8_IXJLccQgRl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxYUtFPQDRdKFpbgvp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxsRjC25LnC5q0Ek454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwRIvK1mE-3e3RCMWN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzFUA1U0lGpoQ5DjLx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugw_Mr95n0M1O92jmgN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyDY26feX3uOdV-Yph4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzh9toEQ9SMPA02rpZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzKRW73BHXulY5GrLJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
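A batch response like the one above can be parsed and indexed by comment ID so that any coded comment's dimensions are one lookup away. The sketch below is a minimal illustration, not the tool's actual implementation: the field names ("id", "responsibility", "reasoning", "policy", "emotion") follow the JSON shown, while `REQUIRED_FIELDS`, `index_codings`, and the two-row sample payload are assumptions introduced here for demonstration.

```python
import json

# Illustrative two-row payload in the same shape as the raw LLM
# response above (a hypothetical subset, not the full batch).
raw_response = """
[
  {"id": "ytc_UgyHTqB50uYhZUnBGcd4AaABAg", "responsibility": "developer",
   "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxsRjC25LnC5q0Ek454AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Every coded row must carry these five fields (assumed schema).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(text):
    """Parse a batch coding response and return {comment_id: coding dict}."""
    rows = json.loads(text)
    by_id = {}
    for row in rows:
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            # Surface malformed rows instead of silently dropping them.
            raise ValueError(f"{row.get('id', '?')}: missing fields {missing}")
        by_id[row["id"]] = {k: row[k] for k in REQUIRED_FIELDS if k != "id"}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgxsRjC25LnC5q0Ek454AaABAg"]["emotion"])  # resignation
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" view cheap: the coding table for a comment is a single dictionary lookup rather than a scan of the raw response.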