Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- And TODAY IN SAN FRANCISCO THEY WANT AUTONOMOUS KILLER ROBOTS to Roam the Street… (ytc_UgxgVcz3r…)
- In the short term the AI winner will take over ALL industry. We are setting ours… (ytc_UgypJNPTp…)
- When there's no one to sell their products to, what then? Are people willing to … (ytc_UgwjGYquM…)
- For a person of average intelligence, AI junk is easy to spot, the question is, … (ytc_UgwSpvRw5…)
- EU needs to move to centre left…stop WAR and AUSTERITY…stop fighting others wars… (ytc_UgyI9RdM9…)
- I am scared by the time when I go to apply for college to get a degree in digita… (ytc_Ugx2IId2Q…)
- Data mining - mining our unconscious bodily quirks to sell without our permissio… (ytc_UgwTI5ej-…)
- "And I saw thrones, and they sat upon them, and judgment was given unto them: an… (ytc_Ugy-mQmVw…)
Comment
Love this content. Something a lot of people overlook is the idea that the AI will decide it doesn’t need humans. There’s a lot of things it would possibly need humans for but one thing I always like to look at it is what if it will view us as it’s creators. Like we are it’s parents. We might be old and dumb but you always want to give your parents a good life. The AI might want to give us the best lives it could give us ❤
youtube
AI Governance
2023-07-07T04:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
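The four dimensions in the table take values from a fixed codebook. A minimal validation sketch for one coded row, using only the category values observed in this batch (an assumption; the full codebook may define more):

```python
# Allowed values per dimension, as observed in this sample batch.
# NOTE: this is inferred from the visible output, not the official codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "user", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"approval", "outrage", "fear", "indifference"},
}

def validate(row: dict) -> list[str]:
    """Return a list of problems with one coded row (empty list = valid)."""
    problems = []
    if "id" not in row:
        problems.append("missing id")
    for dim, allowed in SCHEMA.items():
        value = row.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

ok = {"id": "ytc_x", "responsibility": "ai_itself", "reasoning": "mixed",
      "policy": "none", "emotion": "approval"}
print(validate(ok))  # []
```

Running every row of a batch through such a check catches values the model invented outside the codebook before they reach the coded dataset.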
Raw LLM Response
```json
[{"id":"ytc_Ugyp4oi-W6TemR_eHct4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxO6eJ_TB0z2LhM2U94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxOUuU2w_NvE2obmg14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzZxiSIdML_16ksrx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxA7W3i5uxMAXWGmXV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_Ugza8ag9QOEG1QX2V4F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxJRBTgStpoz9OSMqF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgwFAx4JoyuRA19B7nh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
 {"id":"ytc_UgwgCXlDrl0yMqVwzcx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyUB8KoCTUGwlZbAup4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}]
```
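The raw response is a JSON array with one object per comment, so lookup by comment ID reduces to parsing the array and indexing it. A minimal sketch using only the standard library and the first two rows of the batch above:

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment,
# carrying the four coding dimensions shown in the result table.
raw_response = """[
  {"id": "ytc_Ugyp4oi-W6TemR_eHct4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxO6eJ_TB0z2LhM2U94AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]"""

codes = json.loads(raw_response)
# Index rows by comment ID for O(1) lookup, mirroring "Look up by comment ID".
by_id = {row["id"]: row for row in codes}

row = by_id["ytc_Ugyp4oi-W6TemR_eHct4AaABAg"]
print(row["responsibility"], row["emotion"])  # ai_itself approval
```

In a real pipeline the same index would be built once per batch and reused for every inspection request.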