Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Who should be making the decisions anyhow? Programmers (software developers), …" (ytc_Ugg5W6Ybw…)
- "@pooronkashyyk By definiton.. Slop is rapidly made goods and services which lac…" (ytr_Ugzc2F8An…)
- "The problem is not that they will feel them, the problem is how we're going to r…" (ytc_Ugzp-_ZKB…)
- "A.I can learn way more faster than humans,and eventually like sir Stephen Hawkin…" (ytc_UgwGSpwYI…)
- "Same in STEM cell, try to use ChatGPT to confront my hypothesis, doesn't work it…" (ytc_UgzkBQpdf…)
- "If you are reading this then you and everyone on earth 🌍 has merged with AI. I a…" (ytc_UgwHppbe1…)
- "All these comments from 9-10 months ago making fun of this guy and now AI makes …" (ytc_UgweFOoQE…)
- "No, people in general do not care as the mainstream art is littered with garbage…" (ytr_UgwgTOaVC…)
Comment

> Huh...he's wrong about scifi. Australian author Joel Shepherd wrote a fun space opera maybe 10 years ago that just happens to make a (now) plausible prediction about what AI super intelligence might look like and do. Developing their own "religious" beliefs is amongst them. Also that its unlikely for such intelligence to have unified, single goals or behaviours, or that we could possibly ever understand or predict any of them.

youtube · AI Governance · 2026-02-18T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxkeNDMXtbPijAttV94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwTPoh-QPj3qPPkkyx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz1UmXow4W-2336UO14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbdvGDZ67AohYivOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH1Ba9ufiwtUJkPPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyzVbxuWfYt_2-wmPd4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyQrLssUQmiXhJhWl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyGqulGF4_qvNvD1d54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxITGIhQwHBneNT2Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypAAVXd0Tm7mwtt1d4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
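The raw response above is a JSON array with one record per comment: an `id` plus the four coding dimensions shown in the result table. A minimal sketch of parsing such a response and looking up a comment's codes by ID (the allowed-value sets below include only the values visible in this sample output; the full codebooks are an assumption):

```python
import json

# Allowed values per coding dimension. Only values that appear in the
# sample response above are listed; the complete codebooks are assumed.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, validating values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response, for illustration only.
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes["ytc_x"]["emotion"])  # fear
```

Indexing by `id` is what makes the "look up by comment ID" view cheap: one parse of the raw response, then constant-time lookup per comment.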