Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
WHINY HYPOCRITICAL CHILDREN ---DISCONTINUE CARS AND BRING BACK HORSE DRAWN CARRI…
ytc_UgyQbWAqt…
matrix and the terminator are a documentary. Russia and the west will not win, t…
ytc_UgwhDx3fN…
People crack me up... AI is a &□_■&●=● program, and it's wrong most of the time …
ytc_Ugww9s3-y…
Why has everyone including me WHEN THEY TRY TO BREAK THE FILTER IT DONT WORK BUT…
ytc_UgxYooLIJ…
They saying its easier than traditional draw while they just have to ask a ai to…
ytc_Ugz629ftX…
1 human or 5000 ants.
AI clown: "well 5000 lives is more substantial than just …
ytc_UgwkGthHN…
The automation endgame is that the people who control the robots eliminate every…
ytc_UgwRrtzYU…
Exactly. AI isn’t for you, it’s for the shareholders. Managements needs a new sh…
rdc_m29obk5
Comment
Regarding the human form question:
If an AI is super-intelligent and can control any form - AND the world is currently optimized for the human form to interact with it, why would there be a need to use a different form for a super smart AI controlled robot?
Personally I think starting with the human form for robots makes sense. Over time the form may change, but why worry about that today?
youtube
AI Governance
2025-12-04T20:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugz9MCy7-LWsK6ywqNd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxUVl1Ne2CGo_UhLEF4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_Ugy6VJPQ8LAtdmOrcXt4AaABAg", "responsibility": "user", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzc920NKu5EhIZhzJ54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxVZIrEnPauMp9KOdB4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwqAoXCctafZpo6GW54AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzpWG10qIYRgXqFhdd4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugx2nppn9o6X_5M9Y0t4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyuI4KxQ3wqDFjFWuF4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwY94ItOEC4OZ2KAHJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
```
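The raw response above is a JSON array with one object per coded comment, keyed by comment ID, with the four coding dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed into a lookup by comment ID — the function name and the default-to-`"unclear"` fallback for missing dimensions are assumptions for illustration, not part of the tool:

```python
import json

# The four coding dimensions shown in the Coding Result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_coded_batch(raw):
    """Parse a raw LLM response (a JSON array of coded comments) into a
    mapping from comment ID to its per-dimension codes."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Fall back to "unclear" if the model omitted a dimension
        # (assumed convention, matching the label used in the sample).
        coded[rec["id"]] = {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return coded

# Usage with the first record from the response above:
raw = ('[{"id":"ytc_Ugz9MCy7-LWsK6ywqNd4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
codes = parse_coded_batch(raw)
print(codes["ytc_Ugz9MCy7-LWsK6ywqNd4AaABAg"]["emotion"])  # fear
```

Keying by the comment ID is what lets the "Look up by comment ID" view join each code back to its source comment.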