Raw LLM Responses
Inspect the exact model output for any coded comment; entries can be looked up by comment ID.
Comment
I'd love to see this kind of conversation with more points of view. I haven't watched it all yet. But so far there are no reference to what the majority of humans actually want! What will we choose? I'd much rather read something written my a human than Ai, Id much rather watch actors than AI generated entertainment and so on. Read a book by a human, the list goes on. And then there is the spiritual and intuitive aspect of being human. I've watched a few interviews today on this subject and it does seem the people creating these technologies have a very low moral compassion. Maybe we are about to see humanity rise in a way that is on a whole other level than what's being discussed here.
| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2025-11-03T21:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | contractualist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugxm5AujfDPVHOnpYTZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKOwW8MYJfJI2eWoR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzGUApaN3WCdQSM6vF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw_mt2ZeciTOXowGGx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxE7TJ4wC1t_8NaPEl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzWY0Oli3hdS78WdyV4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyEzIUdvnsXP2nmYxB4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwkKcfC3v7ksC631lB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmDVlo4k-F3XgxAtN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyPTN6ciu5pHY7zhKl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
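The raw response above is a JSON array of coding records, one object per comment, each carrying the five dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`, plus the comment `id`). A minimal sketch of parsing such a response and retrieving the coding for a single comment ID — the `coding_for` helper is hypothetical, not part of the tool, and the two records are abbreviated copies of entries from the response above:

```python
import json

# Raw LLM response: a JSON array of per-comment coding records.
raw = '''
[
  {"id": "ytc_UgyEzIUdvnsXP2nmYxB4AaABAg",
   "responsibility": "none", "reasoning": "contractualist",
   "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwkKcfC3v7ksC631lB4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]
'''

def coding_for(records, comment_id):
    """Return the coding record for one comment ID, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw)
row = coding_for(records, "ytc_UgyEzIUdvnsXP2nmYxB4AaABAg")
print(row["reasoning"], row["emotion"])  # contractualist mixed
```

A lookup on an ID that is not in the batch returns `None`, so a caller can distinguish "comment not coded in this response" from a coded value of `"none"`.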