Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Suchir Balaji (November 21, 1998 – November 26, 2024) was an American artificial…" (ytc_UgxHkFArr…)
- "I haven't worked in years so I'm ahead. Ask AI: Why does Hinton look like my f…" (ytc_Ugy1lLyuW…)
- "This dude created robots because he could never get a girlfriend. To want a robo…" (ytc_UgwbVGTWk…)
- "Put the two AI programs in Australia and Peru, and let them converse. This total…" (ytc_UgxfJleVf…)
- "I don't get it, humans want to make intelligence, but too stupid to want to do s…" (ytc_UgyXXTTI0…)
- "Who will be consuming what AI and Robots will be producing (Services/Products)? …" (ytc_UgxQwPPhd…)
- "Don't worry. Self driving cars won't take over anywhere close to 15 years from …" (rdc_crxmrgi)
- "I appreciate your perspective! The dialogue in the video highlights an interesti…" (ytr_UgzgTfuc9…)
Comment

> They wont even need humanity as consumers as the engine of economy. Data centers and digital resources would be required to power ai and synthetic humanoids, which are the tools to run the world and serve the elites. N the great things, human is not replacing ourselves, population is collasping anyway. Fewer protestors, and fewer humans to cater to, is an absolute dream for dictator.

youtube · Cross-Cultural · 2025-09-29T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
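Each coding result assigns one value per dimension. As a minimal sketch, a record like the one above can be validated against the value sets observed in this dump; note the `CODEBOOK` below is inferred only from values visible here, not from the project's actual codebook, which may define more categories.

```python
# Allowed values per coding dimension, inferred from the values that
# appear in this dump -- the real codebook may define more (assumption).
CODEBOOK = {
    "responsibility": {"distributed", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    for dim, allowed in CODEBOOK.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above:
coded = {"responsibility": "distributed", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(validate(coded))  # -> []
```

A check like this catches off-schema values before they reach downstream analysis.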
Raw LLM Response
[{"id":"ytc_UgwFpAZrSIPsHoLf4fB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw9EPMO2NwFECFvfmF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzsEhCOJOUGrYhN2Ux4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyiqjGBMMwWEYLY3tp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxJbwJ8pJbxKkmz5dd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwg_Cd2xtV_2qVZfu14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwDuZ5Osct1FupNZpJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwrEzQ4USFP1YbVdg54AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxCJ3ZMc2C_0UHhiGp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugz8zOWvrRWSloUypg54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
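The raw response is a JSON array of per-comment records, so the "look up by comment ID" view only needs an id-to-record index. A minimal sketch, using two of the records above (how the tool actually implements lookup is an assumption):

```python
import json

# Two records copied from the raw LLM response above.
raw = """[
  {"id": "ytc_UgwFpAZrSIPsHoLf4fB4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugz8zOWvrRWSloUypg54AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

records = json.loads(raw)
by_id = {r["id"]: r for r in records}  # index for comment-ID lookup

hit = by_id["ytc_Ugz8zOWvrRWSloUypg54AaABAg"]
print(hit["emotion"])  # -> fear
```

The last record matches the Coding Result table shown above for the displayed comment.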