Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below:

- "I have to disagree with Neil's horse n buggy to automobile analogy. It's a false…" (ytc_UgwbZbNat…)
- "I'll just leave this here: What does knowingly mean ChatGPT The term "knowingly…" (ytc_Ugz9GBxMu…)
- "Then explain to the public why are we still 50 years away from true ai…" (ytc_UgyE5Nn02…)
- "When i enjoy art i enjoy the effort that was put into it. If i know that this pi…" (ytc_UgzrFP-5E…)
- "computer scientist here. i agree with you 100% - who honestly wants to live in a…" (ytc_UgwofygIn…)
- "A FANTASTIC (they all are) video about "Degrees of separation" shows that pretty…" (ytc_Ugwf4XfrP…)
- "That's a valid point, but they now view this as a vehicle for spirituality and r…" (ytr_UgyrOXCwG…)
- "Am I right in saying that his version of ai is like the "hoverboards" that came …" (ytc_Ugwb-uQGD…)
Comment

Relax. There is zero evidence that Artificial General Intelligence (which is a step on the way to these hypothetical artificial superintelligences) will be achieved anytime soon or in our lifetime, or even that the current advances with large language models etc. are even a step in that direction. They look impressive from the outside, but they may be complete technological dead ends with regards to AGI. These predictions are utter nonsense.

Source: youtube, 2025-09-05T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxI8v3mvweRKfCyzpl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwemgYUWOIjAxROOcR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw5CL4kFI7NTureCM14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwOo0EIn-8UMSIW9BB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwkzIbPpkXO8cijaVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyYK4t6Bj3IwOR-58Z4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwdzSe5Mj7R5_2Zs1V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyYa9Mzf4XlEBFXINJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxZnU19qSo_0MACY7h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxvztazTRPotoa2pOZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
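The raw response above is a JSON array in which each record carries a comment ID plus the four coded dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion). As a minimal sketch of how such a response could be parsed and validated, the snippet below checks each dimension against an allowed-value set inferred only from the values visible in this sample; the real coding scheme may define more categories, so `ALLOWED` is an assumption, and `parse_coding` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual coding scheme may include additional categories.
ALLOWED = {
    "responsibility": {"none", "distributed", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"outrage", "indifference", "fear", "mixed", "approval", "resignation"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: dimensions}, validating each value."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        dims = {}
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
            dims[dim] = value
        coded[cid] = dims
    return coded

# One record from the response above, for illustration.
raw = ('[{"id":"ytc_UgyYa9Mzf4XlEBFXINJ4AaABAg","responsibility":"none",'
      '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
coded = parse_coding(raw)
print(coded["ytc_UgyYa9Mzf4XlEBFXINJ4AaABAg"]["emotion"])  # prints "resignation"
```

Validating at parse time catches a model drifting outside the coding scheme (a misspelled or invented category) before the record reaches the results table.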