Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Uber and Lyft drivers pushed out legacy taxi company drivers. Tech and AI might …" (ytr_UgyquzfUN…)
- "Lets face it, we're all screwed. We just dont know how long it'll take for the s…" (ytc_UgzL61U6x…)
- "Am I supposed to be less worried about these companies than openAI and palantir …" (rdc_mz21l09)
- "Education is the combination of learning skills and acquiring knowledge. Yes, t…" (rdc_jvkou9b)
- "I think earning a bachelor's degree is still relevant. However, you need to choo…" (ytc_UgxR4gOGA…)
- "11:43 ask this to any corporation or business as that is their exact mindset, th…" (ytc_UgxCyT1bI…)
- "I asked AI… how many school shooters % wise are white- it said the number isn’t …" (ytc_UgyW7JFx8…)
- "AI coders produce bugs for sure. I produce 5x more bugs and take 100x more time …" (ytc_Ugzlkng19…)
Comment (youtube, AI Governance, 2025-06-17T04:1…)
> He thinks we are all stupid. AI is not a danger. Certain people and organizations want to control AI via regulations while keeping it from the rest of us. This will make these people more powerful and rich. But if we all have access to its capabilities, their value is diminished. So he is lobbying for those who want to frighten the public into allowing them to control AI. "If you don't give AI and all its capabilities to we elites, it will spell disaster." If you are stupid enough to fall for this, I have a bridge to sell you.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxkRC_a2Dccj2RnYZB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxJbFatjsIrzsd87Vd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzcVYeSMNZwhiTE9Bl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"},
  {"id":"ytc_Ugx5zl8Q6MMqj17eVdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzROW0I-WM2JywRoP54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwAwbSZHIv-VnVfyy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyvuPKCDNeVl5vMSLh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxMgLa1Pjflys_X4UZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxI9_TF2dbp-YYPQSN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugzf4XTU1d1juCubx4N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
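The raw response is a flat JSON array keyed by comment ID, with one record per comment carrying the four coded dimensions (responsibility, reasoning, policy, emotion). Recovering one comment's coding is a simple parse-and-filter; a minimal sketch (the `lookup_coding` helper is illustrative, and the inlined sample reuses two records from the response above):

```python
import json

# Two records copied from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgzcVYeSMNZwhiTE9Bl4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "ytc_Ugx5zl8Q6MMqj17eVdx4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

# The four coded dimensions, in the order the table displays them.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return one comment's coding, or None."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            # Keep only the coded dimensions, dropping the id itself.
            return {dim: record.get(dim) for dim in DIMENSIONS}
    return None

coding = lookup_coding(raw_response, "ytc_UgzcVYeSMNZwhiTE9Bl4AaABAg")
print(coding)
# → {'responsibility': 'company', 'reasoning': 'consequentialist',
#    'policy': 'industry_self', 'emotion': 'outrage'}
```

A missing ID returns `None` rather than raising, which matches the dashboard behavior of showing an empty result for an unknown comment ID.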