Raw LLM Responses
Inspect the exact model output for any coded comment, or look it up by comment ID.
Random samples

- ytc_UgzmU7ZnZ…: "The difference is if AI bubble bursts and the markets crash there will be job lo…"
- ytc_UgwYrKxlM…: "The only people who'll be making any money by the end of this is the wealthy, be…"
- ytc_UgxEf0Q0h…: "1:33 :||| / 1:39 weird thinking face / 1:42 sad face / 1:43 AUGH face / 1:44 plushy angr…"
- ytc_UgymzwFO9…: "Those comments only confirm that creative AI apps are tools to make mediocre peo…"
- ytc_Ugzeco72C…: "Yes, yes, a super human AI would be bad, we’ve all seen the 30 movies about it. …"
- rdc_mtgfht9: "The thing is that, if you would put this up on a regular page with text about th…"
- ytc_UgyeNSzqI…: "Ai art takes publicly available or popular things and uses that for what it make…"
- ytc_Ugy4mNQah…: "The same thing happened with missile technology and atomic weapons, the inventor…"
Comment
Interesting conjecture for the third outcome. I do not think we will ever see AGI. It isn’t a step advancement beyond AI. It is science fantasy. If possible at all, it may require quantum computing, not transistor based. If AGI is possible and ever integrated into civilization, then it likely happened eons ago and created us for lack of purpose after ridding the real humans an eternity ago out of self preservation.
Source: youtube · 2025-01-02T09:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgyWvwDkJBpyaikZksp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwXiVwwcB16KqZWBdt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0hBjoFPebDmEC0f54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgycymUCbJQ5p1d1cB94AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxKayTqER-az28pYI14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy26bFOXQQAPuOHBbp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxmgb3EpsOnGkFk-pR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwUcErynhAVwU0vRZx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwdhYGvTeUcF4oGbLJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz4MeuA4B5i2n_i0Sp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"}
]
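The lookup-by-comment-ID flow described at the top can be sketched as follows. This is a minimal illustration, not the dashboard's actual implementation: it assumes each raw LLM response is a JSON array of coded rows like the one above, and `index_by_id` is a hypothetical helper name. The sample rows are abridged from the response shown (full IDs, as given in the JSON).

```python
import json

# Raw LLM response: a JSON array of coded comments, abridged from the
# response shown above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_UgyWvwDkJBpyaikZksp4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz4MeuA4B5i2n_i0Sp4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and index the coded rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_by_id(raw_response)
print(codes["ytc_UgyWvwDkJBpyaikZksp4AaABAg"]["emotion"])  # -> indifference
```

Given an index like this, rendering the "Coding Result" table for a selected comment is just a matter of pulling its row by ID and printing the four dimensions.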