Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "For the Post Singularity Episode 4000 they will have to change the podcast to Hu…" (`ytc_Ugz6Pz07G…`)
- "Most of these Companies are just using this whole AI bandwagon as an excuse to g…" (`ytc_UgzqAGF2O…`)
- "I wish Amazon would be effected and even collapse if their massive bet goes wron…" (`ytr_Ugy6CY9YZ…`)
- "AI Is a bias bubble that just repeats what human input it was given. Your algori…" (`ytr_UgwKVFIXH…`)
- "No job = no money = no purchase= no business= no production= no use of AI 😂…" (`ytc_UgxOSf92K…`)
- "Shout out to Karen Hao! Extremely intelligent and lovely lady! Your honey comp…" (`ytc_UgziRrdAi…`)
- "I can't believe this is still happening, I'd imagine this isn't a univerisity be…" (`rdc_kgpd5wk`)
- "I did the same thing with ChatGPT and asked who exactly is behind all of this - …" (`ytc_UgyVYVCR2…`)
Comment

> Part of his unsaid comment was that most specialist doctors, lawyers, journalists and other professionals might think hard about AI taking their jobs. That is between AI taking over all control and some "clerical jobs" being lost by humans. The gist I got was that all humans may become irrelevant to AI and its goals. I guess it comes down to whether or not "we" have any say in the matter? Are humans, after all, just a stepladder for silicon superintelligence? Kinda like how animals needed bacteria to set the stage so we could appear?

youtube · AI Governance · 2025-02-07T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwBuDWKoTVRvXtzmcJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgziLhlzoNze2RilYOJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugza2gAxLSlIZHyOkRF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw0gjPhARdfFrDYr8d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyFVP0aOjIQPb-PpfR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzkszZD6irliM7zBBJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx33Ch6NjEeTwmXNnN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwDdAFloWEf_MvNY5V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwtjNw_Lm-7cUwAv0p4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxncuGljZZKCu411HN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
```
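A raw response like the one above has to be parsed and validated before the per-comment coding results can be looked up by ID. The sketch below shows one way to do that, as a minimal example: the sets of allowed values per dimension are inferred only from the sample output shown here (the actual codebook may define more categories), and the `parse_coding_response` function name and the `ytc_x` ID are illustrative, not part of the pipeline.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# ASSUMPTION: the real codebook may include categories not seen in this sample.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "approval"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch-coding response and index records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the allowed set, so malformed model output is caught early
    instead of silently entering the results table.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {value!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response, for illustration only.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded["ytc_x"]["emotion"])  # fear
```

Validating against a fixed value set at parse time is what makes a "look up by comment ID" view safe to render: any record that reaches the table is guaranteed to have all four dimensions with known values.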