Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Meh. Eric Schmidt might have a vision, that doesn’t make it correct. Specific to…
ytc_UgwAgpIWN…
In the last section from about 14:44 the person talks about how it would be agai…
ytc_UgxAVAkx7…
Turning the humans and jinn living on earth into abnormal creatures is norm…
ytc_UgwUmk2gL…
@codirennke1109 yes. We can teach it that humans and AI will be synergistic. But…
ytr_Ugw1xJd33…
well really the whole idea of creating an a.i is to make things easier, what wou…
ytc_Uggfx-CZ2…
Imagine AGI is achived, its placed in positions of military and social power, it…
ytc_UgzXVGITN…
OMG, imagine AI cops programmed with the same training US cops receive now... St…
ytc_UggenJTFf…
The Waymo is like a Seal Team 6 operator with night vision goggles on during day…
ytc_UgztmJhFt…
Comment
AI is not a problem. AI is a tool. People need to learn how to use it, but most people will not make the effort.
Here are two reasons Americans struggle to make ends meet.
Low IQ
Low literacy
IQ 85–100 = 113 million below average people.
IQ 70–85 = 46.6 million challenged people.
That's 159.6 million people who most likely hold low-paying jobs. That's 50% the country. Now add to that the fact that 54% of American adults read at or below a 6th-grade level.
Intuition tells us we are talking about the same lower half of the bell curve in both statistics.
More than half of the people in America are essentially stagnant. AI will save them.
youtube
AI Governance
2025-09-04T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwO0wFEWkxR55XV9vN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy0FBIFMy91geIUXvV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-k5gd0GFFCdQHRCJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyrBKPjzJoazYmeO814AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzTQ-PRuH8GYhdxIFJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzZyl_ISG5xo5bLmAt4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_aGyHgYRJoOQsSPF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9NmAHhMlGtjZrj7h4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxBMBAmoIbOSgmtHuZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxm2ejTtLRvaq0Z0JR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
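The raw response is a JSON array of per-comment coding objects, so the "look up by comment ID" view above amounts to indexing that array by the `id` field. A minimal sketch, assuming only the field names visible in the response; the `index_by_comment_id` helper is illustrative, not part of the tool:

```python
import json

# Two rows copied from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgwO0wFEWkxR55XV9vN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy0FBIFMy91geIUXvV4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse the model output and index each coding dict by its comment ID."""
    codings = json.loads(raw)
    return {row["id"]: row for row in codings}

index = index_by_comment_id(raw_response)
coding = index["ytc_Ugy0FBIFMy91geIUXvV4AaABAg"]
print(coding["responsibility"], coding["reasoning"])  # user virtue
```

From here, rendering the "Coding Result" table for one comment is just iterating over the four dimension keys of the looked-up dict.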