Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I hope that more people talk about this - when Ai and robots replace human work,…" (ytc_Ugy_NkwqT…)
- "Don't listen people. Nothing will happen this ai is just selfgoogling bot needed…" (ytc_UgyoHGFUS…)
- "AI is a fun lil thang to goof around with, idk why everyone had to be so black a…" (ytc_UgyV0anZB…)
- "I loved the AI personhood debate! My two cents is that when we give something a …" (ytc_Ugwq4mcMN…)
- "All in all, AI is a neat thing. The only reason it's receiving so much hate is b…" (ytc_Ugx1cgAdu…)
- "If you build a robot that’s supposed to help you make the world better and if th…" (ytc_UgzgQey6r…)
- "This idea is a workable life assisting program. It is Voluntary, and has appeal…" (ytc_UgybIYRcJ…)
- "Cool story. The sun will take the planet back to the stone age in a timeframe t…" (ytc_UgyoUxhhL…)
Comment

> How naive of this scientist, couldn’t see this coming… when humanity split the atom, we made nuclear bombs with the technology and nuclear reactors that created weapons grade plutonium …. I’m no AI expert, but the notion “bad actors” might run with the technology was ALWAY’s there

youtube · AI Governance · 2023-05-03T04:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyHTqB50uYhZUnBGcd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1DjH8_IXJLccQgRl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxYUtFPQDRdKFpbgvp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxsRjC25LnC5q0Ek454AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwRIvK1mE-3e3RCMWN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzFUA1U0lGpoQ5DjLx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugw_Mr95n0M1O92jmgN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyDY26feX3uOdV-Yph4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzh9toEQ9SMPA02rpZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzKRW73BHXulY5GrLJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
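The raw response is a JSON array with one record per coded comment. A minimal sketch of parsing such a response and validating each record against the four coding dimensions, so a coding result can be looked up by comment ID. The allowed label sets below are assumptions inferred from the values visible on this page, not a confirmed codebook:

```python
import json

# Allowed labels per dimension, inferred from codes visible on this page
# (assumption: the real codebook may define additional labels).
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "mixed", "fear", "resignation", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting unknown labels."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response, for illustration only.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["policy"])  # → regulate
```

Validating at parse time keeps malformed or hallucinated labels out of the coded dataset instead of surfacing later as analysis errors.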