Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
It seems like ai should be built with the highest initial inputs being towards altruism. Even in that scenario outcomes could be catastrophic but the potential for incredible positive results could also be potentially realized. If you have the ability to build a god it should be done in line with the highest ethical standards that can be applied. It really shouldn’t be done at all but think the cats out of the bag on that one.
| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2024-06-07T20:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugy7FhpXRCOevbLGoQ54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyRK08ijyxj43Stl8F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwlEI-7nUquT3W7Gl94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwalsiOPM5oQdBZe5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJoOYSxRmJrtz3UOx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyXntFmnc0JEipIU8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxU3z6ApY7HlfOJymZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzzi6zgUIlmXZcRwjB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz72opCi2I6pRyvuBl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxJSn3-E_xm8ehT79B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"})
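Note that the recorded response above closes with `)` rather than `]`, which makes it invalid JSON and is a plausible reason every dimension in the coding result above parsed as `unclear`. As a minimal sketch (not the pipeline's actual parser), a well-formed response of this shape can be turned into a lookup keyed by comment ID; the IDs and code values below are copied from the raw response, and the helper name is a hypothetical illustration:

```python
import json

# Two rows copied from the raw LLM response above, with the array
# properly closed by "]" (the recorded output ends in ")" instead).
raw_response = """
[{"id": "ytc_Ugy7FhpXRCOevbLGoQ54AaABAg",
  "responsibility": "user", "reasoning": "virtue",
  "policy": "none", "emotion": "outrage"},
 {"id": "ytc_UgyRK08ijyxj43Stl8F4AaABAg",
  "responsibility": "company", "reasoning": "consequentialist",
  "policy": "unclear", "emotion": "mixed"}]
"""

def codes_by_id(response_text):
    """Hypothetical helper: parse a JSON array of per-comment code
    objects and index the rows by their "id" field."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = codes_by_id(raw_response)
print(codes["ytc_Ugy7FhpXRCOevbLGoQ54AaABAg"]["emotion"])  # outrage
```

A production parser would also need to catch `json.JSONDecodeError` (as the truncated `)` case shows) and fall back to marking each dimension `unclear`, which matches the behavior visible in the coding result table.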