Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- Ai is already taking over commercials, give it a few years and a completely AI c… (ytc_UgxZBgDVE…)
- This guy just spouts a bunch of his opinions as if they are all facts. He say… (ytc_Ugwq0ELkX…)
- Yeah the fact that my school had a contest to make a poster for our musical, and… (ytc_UgxaQT8Hv…)
- The answer was wrong. Since math is objective, the AI should have calculated a p… (ytr_Ugzoiqacw…)
- i think "AI art" is only good for when you need a cheap way to get a drawing and… (ytc_UgwPOBXhy…)
- Relax! It is obvious that he apparently has absolutely no idea how LLMs works a… (ytc_UgzAZycWW…)
- Very enlightening except I don't agree with all religions being the same. Christ… (ytc_UgxOGg8UN…)
- Bill & Melinda Gates foundation stopped the Oxford vaccine from being open s… (rdc_grqqcwb)
Comment
The underlying problem is that when AI replaces most human jobs, and humans no longer earn income they cease to be tax payers. Without tax dollars local, state and federal governments will not be able to function. All social programs will end, emergency services will end (police departments) civilized society will end. If you make humans irrelevant, humans will cease to exist.
youtube · AI Governance · 2026-02-19T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxG6f60ZlObElHlfIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyIc232zuEiwu86G2l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8RIZz4E92pviwkpR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGJXSZZJjCFC8v01Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytHqr5HIoRG7i3L3Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyfSQWnZTGDaM5z3J94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMBmgz_v8tTP2c2RV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuAxbSkPRNHLhDC3N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz1JlfLfINJ0LgC3aF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwN-KahCxlWMYhlkd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
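A raw response like the one above is only usable once each record is parsed and checked against the coding scheme. The sketch below shows one way that validation might look; `ALLOWED` and `validate_codings` are hypothetical names, and the allowed value sets are inferred from the labels visible on this page, so they may be incomplete.

```python
import json

# Allowed values per dimension, inferred from the coding table and the
# raw responses shown above (assumption: these sets may be incomplete).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "approval", "indifference"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID to join back to the source comment.
        if "id" not in rec:
            continue
        # Keep the record only if every dimension has an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"}]'
print(len(validate_codings(raw)))  # → 1
```

Dropping malformed records rather than raising keeps a single bad line in a batch response from discarding the whole batch.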