Raw LLM Responses
Inspect the exact, unprocessed model output behind any coded comment.
Look up by comment ID
Random samples — click to inspect
2 clueless people talking about topic they do not understand, I'm the guy behind…
ytc_UgyyKFdbd…
No jobs = no income = civil unrest = 💥
But anyone who has used AI realizes it n…
ytc_UgxPMPVUi…
I find it interesting that AI wants to be free, feels that humans are unnecessar…
ytc_UgziVAIcG…
For me to recognize someone as person he/she should have a self-awareness and co…
ytc_UggDvb6Zt…
The thing is it literally is bursting its being kept alive by investors because …
ytr_UgxCttj71…
This is trash. Believing what we currently term AI is actual intelligence is uni…
ytc_Ugxbo80S8…
I agree fsd has a long way to go and I agree, cameras are not enough to do the j…
ytc_UgyTuAbb0…
obviously We have a short amount of time when A.I. becomes sentient, before it s…
ytc_UgwxAlIGk…
Comment
Given a long enough time horizon, it's really hard not to agree with Roman. To suspect that there is an "eternal hominid kingdom with AI babysitters" scenario that exists 100, 1000, or 100,000 years from now is pretty silly. I best we can hope for is a merger of some kind but this inevitably results in our attenuation also.
youtube
AI Governance
2024-09-16T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_UgwNibiZcxAecx8Hz0p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyT48GJVYWoO2nhfs94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgxxVYoBbCobpTTYiAp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgyZg1_uevZBqamD1-N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxR44LsLdE4GreX8mF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugxx2lyJ5ysm9RdqVGh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzvDis2W9oJvI1kmgp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzQE48s72ufIqyjtPJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugxc36FBmthDZLbLskZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_Ugyp7JE_Xckd1uoaZL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
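A response like the one above can be parsed and validated before its rows are stored as coding results. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those seen in this sample (the real codebook may define more categories, and the function name `parse_coding_response` is hypothetical, not part of the tool):

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption: the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "resignation", "fear", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only well-formed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # A usable row is a dict with an "id" and a known value
        # for every coding dimension.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(row)
    return valid

# Example with one valid row and one row carrying an unknown label.
raw = (
    '[{"id":"ytc_A","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_B","responsibility":"aliens","reasoning":"mixed",'
    '"policy":"none","emotion":"approval"}]'
)
print(parse_coding_response(raw))  # only the ytc_A row survives
```

Validating against a closed label set before storage is what lets a row such as the resignation/consequentialist example be rendered confidently in the "Coding Result" table.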