Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Until we know what it's capable of", What if we learn too late? Because you jus… (ytc_UgyzCdzKx…)
- Ooooohhhh stop it! Stop being so overly judgemental. Be honest here: Did your mo… (ytr_Ugw1DuuX7…)
- We were warned be it by a film what happens when our perceived control Is place… (ytc_UgzTwmzEK…)
- #1 people dont need AI #2 technology in the past 20 years has not improved quali… (ytc_UgzPwYOfB…)
- I struggle with the argument that AI can do better. It starts by mimicking someo… (ytc_UgxXb7iZs…)
- This just reminded me how dumb the Picard TV show was. Hey, there's these 50 r… (rdc_gd8htd8)
- Face recognition technology, I'm voting straight republican for the forceable fu… (ytc_UgyZW9jV_…)
- Anonymous if you want to go be a robot like their citizens go ahead. I would nev… (ytr_UgxYAMxSE…)
Comment
Here's the general rule of thumb: if the average Joe learns about the crash, the failure, the crisis... then it has reached the bottom. The average Joe is the last to know, the last to react, and has the least stake in it. At that point, such news is no longer "new" enough to have any effect on the market.
Based on what I have observed, many business leaders are already aware of AI's limitations and have stopped jumping into the hype, so the growth has definitely stopped. For now, as of 30/09/2025, most investors are buying into the sunk cost fallacy, hoping for some breakthrough to turn things around, but their patience is certainly running out.
If you are waiting for the bottom, remember the general rule of thumb
youtube
AI Responsibility
2025-09-30T20:4…
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyYv9XHLo7dZrUSacp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwC2i1dNIsLVLIQnxB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy3g84QdEcjoYM-lb94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwYQ_cduJfU_8iIJMt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzwPAPizE-jXACxwu94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxa3klc_uCM-ghWqdt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqSXL_dof5n-tdgb54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw8afnuRtRc_Snv1Sp4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAwyQxKmiWKaFINDl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKuN2gAQoPNGaAfER4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
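A raw response like the one above is only useful if every record parses and every label falls inside the codebook. Below is a minimal validation sketch; the allowed vocabularies are inferred solely from the labels visible in this sample, and the real codebook may permit more values.

```python
import json

# Dimension vocabularies inferred only from labels seen in this sample;
# the actual codebook (not shown here) may allow additional values.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed, in-vocabulary records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every coded dimension must use a known label.
        if all(rec.get(dim) in vocab for dim, vocab in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical IDs for illustration; the second record uses an
# out-of-vocabulary "responsibility" label and is dropped.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"consequentialist",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"virtue",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_codings(raw)))  # prints 1
```

Records that fail validation can then be queued for a retry or for manual coding rather than silently written to the results table.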