Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgxhVhjai…` — "AI Colonialism narrative. Mega money. All the fuss of being replaced is yet a…"
- `rdc_d1kve11` — "I experienced the same thing when I tried to bring reality to the conversation a…"
- `ytc_Ugzi6HIYE…` — "Super intelligence? How smart do you really need to be to outsmart humans? How m…"
- `ytc_UgwAJfmo0…` — "well done for showing that AI is such a bullshit term. I had employers asking m…"
- `ytc_UgzVqj3bP…` — "I work as a mid-iOS Engineer for a large company, I can tell you for sure it’s n…"
- `rdc_katyeuy` — "https://youtu.be/O-2tpwW0kmU?si=OARuFBDIjhvEaAB7 I always like to share this vi…"
- `ytc_UgymqlHw6…` — "We can't continue on the system we're on it is terminal to us. This notion of a …"
- `ytc_Ugw8FJIbd…` — "so ai will instal pumbing instalation or do clean the trash canals and so much m…"
Comment
There needs to be an AI oversight committee that determines "taxable" amounts based on AI outcomes. In the example given about doctors, if AI replaces $XXX dollars of labor, then its taxed at XX% and some "AI humanity fund" gets $XX. All AI "bots" need to be registered, and identified (like a patent) back to a registered operator of AI (who also pays fees to operate an AI). If it's determined that an AI bot is producing "outcomes" without being registered through the appropriate AI oversight committee, then it's considered a threat and simply turned off. If there is no stable environment to tax AI...then it's threats to humanity are truly limitless.
youtube · AI Governance · 2025-06-16T10:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwm4sFyVyJN6Ndhj7B4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJjUbs8ZS0K6daLbB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOseFkqfE_8ZIB_PN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwTvgg3gwuBsK6XbKd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyPY5lBG4VxfxBp-vh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgaUF0DGK2aleF-VN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgzIlELsiG9CSGaf7lh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwlD_wJXTUP8yXoHrt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWZj-jgR4ofP354jl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyxdUovt_6j7ZruZOt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
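A lookup like the one this page supports can be sketched in Python: parse the raw model output and key each coding record by its comment ID. This is a minimal sketch, assuming the response is a JSON array of records with the four dimension fields shown above; the `index_by_id` helper and the truncated `raw_response` sample here are illustrative, not part of the tool.

```python
import json

# Two records copied from the raw LLM response above (illustrative subset).
raw_response = """[
{"id":"ytc_UgwTvgg3gwuBsK6XbKd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzIlELsiG9CSGaf7lh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# The four coding dimensions, as listed in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: {dim: rec[dim] for dim in DIMENSIONS} for rec in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgwTvgg3gwuBsK6XbKd4AaABAg"]["policy"])  # regulate
```

With the records indexed this way, any coded comment's dimensions can be retrieved in constant time from its `ytc_…` or `rdc_…` ID.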