Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We the people need to not support these companies that are replacing their human…" (ytc_UgyqCu5Ke…)
- "the only way this works is if the ai companies start paying universal basic inco…" (ytc_UgzCMNjdW…)
- "The guy says, "Just a hobby." No one is worried AI is going to steal their tenn…" (ytc_UgwQ9bQz1…)
- "A survivor handout…. Oh goodie. I got a better idea.. how about you distribute l…" (ytc_UgycQdTET…)
- "Could try limiting the amout of ai to employees ratio with some werid legislatio…" (ytc_UgzYwJH1J…)
- "no it wont, the issue is if AI and robots do everything then who is the capitali…" (ytc_Ugxwl-gqD…)
- "Ai art takes some creativity to get it to spit out an image you have in you head…" (ytc_UgwAoc48F…)
- "This is my first time watching a video of yours, and I must say, your art style …" (ytc_UgzQF9qHg…)
Comment

> The foundation of AI should always be that the final answer to any issue is resolved based on a few basic criteria. Is the action to be taken kind, compassionate, ethical? If so, proceed, if not, reconsider alternatives.

youtube · AI Governance · 2025-12-06T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxezesmSMPcQF3FEht4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw7qlOKQCaVE-O_jd94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxlIUmop_aySlwFA714AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwK4qt3eHf9h_-nklp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzMXd1RTA_vVqKkJ7Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxwbLVs8sboawNy1s14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxsGik4IBQgvvs8v5p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyGqcaO5jMkWR52cVd4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw9c-Sc0rU-NJjt7Lx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgymjetgbBorg5MJO_B4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
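The raw response is a JSON array of per-comment records keyed by comment ID, which is what makes the "look up by comment ID" view above possible. As a minimal sketch of that lookup, the snippet below parses such a response and indexes it by ID, skipping records that lack any of the five fields shown in the output. The field names come from the response above; the `index_by_id` helper and the malformed-record handling are illustrative assumptions, not the tool's actual implementation.

```python
import json

# Raw model output in the format shown above (truncated here to two
# records for brevity; real responses carry one record per comment).
raw = '''
[
  {"id": "ytc_UgxezesmSMPcQF3FEht4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw9c-Sc0rU-NJjt7Lx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
'''

# The five coding dimensions visible in the raw response above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(raw_json: str) -> dict:
    """Parse a raw LLM response and index its records by comment ID,
    dropping any record that is missing a required field."""
    records = json.loads(raw_json)
    return {r["id"]: r for r in records if REQUIRED_FIELDS <= r.keys()}

coded = index_by_id(raw)
print(coded["ytc_Ugw9c-Sc0rU-NJjt7Lx4AaABAg"]["policy"])  # prints "regulate"
```

Indexing by ID up front makes each lookup O(1), which matters when the same batch response is inspected repeatedly from the comment view.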