Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click to inspect)
- "That's what happens in capitalism, high labour cost incentives automation. The a…" (ytc_UgyXXj0_4…)
- "bro got so bent he started generating insults. he mustve been thinking about tha…" (ytc_UgyQox_9E…)
- "Some ppl just lazy or have no time to put the effort into something. Or just giv…" (ytc_UgzCUJ8v_…)
- "Bernie proposes 19th and early 20th century solutions to 21st and 22nd century p…" (ytc_UgwKKHyg5…)
- "Remember in the event of a rouge ai just scream paradoxes and they will fry thei…" (ytc_UgxZ2b-ym…)
- "i find it funny how they call themselves artists after typing some text into an …" (ytc_Ugz6KRFeL…)
- "AI makes no decision on its own but it does respond to inputs using complex algo…" (ytc_Ugw57UzGd…)
- "thats what im scared of, since art has been my passion since a young age, and im…" (ytr_Ugz1VzhmF…)
Comment
I'm in my 20s right now and I am really, really scared what is coming in the next 10 years.
More scared then when I look at a possible WW3. Sure, I am smart enough to see, that there are humans with passion that want to do something for their lives. Especially Artists. So I don't think that in an instant, there are 99% of jobs just gone.
It's more like: "99% of the jobs can be done by an AI in the future, if you let them" ig.
But the real thing is: Are we going to have a better life, or no life at all because an AI comes to the right conclusion: "You are cancer for the planet and you will die, so that I can life."
Like, what could we do? It's not like we can just run away. You can hide from a human or from an animal. But can you hide from a terminator?
And we... will just watch it happen?
Fair enough, I don't want to swap to be born in 1908 and yes, the human is always afraid of changes and the unknown. But still...
Maybe I watched too much doom postings but... damn
I feel just helpless
Source: youtube · Topic: AI Governance · 2025-10-07T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz-LnQNjSMi9aI1htN4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzBbXcVGlHREzQyIal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwq8LjLMjmxmDYxndR4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyJI5g5vSYCVtCQ4LB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw_ya3y5Jc7nHv8v714AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugz0H6UrvM7uY-jJXeN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx2OGu48NsvURhQdu94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgwjZtjfiOIwPDzMjCh4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgybawZVRXK6iTntLq94AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz4MKMMJfUyuHiw7Qt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
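A raw response like the one above can be parsed into the per-comment coding table shown earlier. Below is a minimal sketch of that step, assuming the four dimensions and the value vocabularies visible in this page's samples (the tool's actual controlled vocabulary may be larger); the function name `parse_coding_response` is hypothetical, not part of the tool.

```python
import json

# Allowed values per dimension, inferred only from the responses shown on this
# page (an assumption; the real codebook may define more values).
SCHEMA = {
    "responsibility": {"unclear", "company", "distributed", "government"},
    "reasoning": {"unclear", "consequentialist", "contractualist",
                  "deontological", "virtue"},
    "policy": {"unclear", "regulate"},
    "emotion": {"unclear", "fear", "indifference", "approval", "mixed",
                "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Map a raw LLM coding response to {comment_id: codes} (hypothetical helper)."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # "ytc_" = top-level comment, "ytr_" = reply, per the IDs in the samples
        if not cid.startswith(("ytc_", "ytr_")):
            continue  # skip rows without a recognizable comment ID
        codes = {dim: row.get(dim, "unclear") for dim in SCHEMA}
        # Coerce any value outside the known vocabulary back to "unclear"
        codes = {dim: (v if v in SCHEMA[dim] else "unclear")
                 for dim, v in codes.items()}
        coded[cid] = codes
    return coded

raw = ('[{"id":"ytc_Ugz-LnQNjSMi9aI1htN4AaABAg","responsibility":"unclear",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(parse_coding_response(raw))
```

Defaulting out-of-vocabulary values to "unclear" keeps a single malformed row from breaking the whole batch, which matters when one response codes ten comments at once.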