Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Because the people who code wouldnt teach a learning machine how to code" — ChatG… (ytr_UgzQEqOak…)
- I've got a better one for you hey AI CEOs can't be assassinated there you go… (ytc_UgxeYgmKF…)
- NGL , we should js bring back witch hunts but instead of witches we go after ai … (ytc_Ugw1Pgekr…)
- “According to dude-whose-job-is-super-at-risk-right-now, their team is doing gre… (rdc_jpm4qsw)
- 14:27 im mentally disabled (diagnosed with ASD for 7 years) fuck AI "art." not a… (ytc_UgxQzF6RQ…)
- @approachingetterath9959 I wish we did not have to, but sometimes, more often, C… (ytr_UgzugsmIe…)
- Our generation born without PC and now we live with AI, stop whining. Oh, and on… (ytc_UgyBkryuB…)
- As a pro AI guy. I got to say that I agree with of what is being said, The Pica… (ytc_Ugwheprfh…)
Comment
21:00 the only thing I have to vocally disagree with here is this phrase "AGI is not rooted in scientific evidence".
CAN AGI be achieved WITH this tech? Probably, probably not. Its definitely not proved scientifically and I'm willing to bet we need more advances for sure.
CAN AGI be achieved at all? Yes, clearly. The only example needed is the fact that you and I are alive right now and exist as AGIs in the world.
I can't tell you WHEN we will get AGI on computers but if you keep moving forward you will definitely reach it. To say that maybe it's not possible feels like a crazy statement to me.
There's nothing special about human beings that couldn't be replicated one way or another by human progress.
youtube · Cross-Cultural · 2025-07-02T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgyDjHn_exXwDKwBqhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugyhgy6BGvDP8QX_tkl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxffXZFL19fIHdU0o54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwDCzs_qjPHgMA-JDp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzG0ralqUWNzWjweO14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxx9wcBGiqdjKHkbRx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-OY38oCtKSqznqJF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxUMfi5CvHYVRNqsvd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyeSInA4tmLt91ud6J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOM8xTeYvakyFl_Ed4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
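The raw response above is a JSON array, one object per comment ID, with the four coding dimensions from the table. A minimal sketch of how such a response might be parsed and validated — the field names come from the JSON itself, but the allowed value sets and the `parse_coding_response` helper are assumptions inferred from the visible samples, not the pipeline's actual schema:

```python
import json

# Allowed values per dimension — inferred from the samples shown above,
# not a complete schema.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "government", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM JSON array, keeping only rows with a string id
    and a recognized value for every coding dimension."""
    coded = []
    for row in json.loads(raw):
        if not isinstance(row.get("id"), str):
            continue
        if all(row.get(dim) in values for dim, values in ALLOWED.items()):
            coded.append(row)
    return coded

# Hypothetical example row (the id is illustrative, not a real comment).
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
print(parse_coding_response(raw))
```

Dropping malformed rows rather than raising keeps a single bad code from discarding an entire batch; a stricter pipeline might instead log rejects for re-prompting.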