Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The scariest thing about the AI is the threshold for AI to fail and harm or kill…" (ytc_Ugw5P2BsR…)
- "The true gold of AI is for a very few to manage and control (enslave) American …" (ytc_Ugw9A8kAS…)
- "I think the people that stand to gain the most want less people around that they…" (ytc_UgwEdjpPb…)
- "Yes it is useful, depend at which angle you are looking at. Wrong Angle means t…" (ytc_UgyGRDMVK…)
- "Fucking finally. It's not much, but it's something. AI reorganizing the workfo…" (rdc_jn50993)
- "The real reason its not gunna successfully help you, is the fact that you want a…" (ytc_UgzKCG8ae…)
- "There needs to be safeguards and or power source disconnects in case A.I. goes r…" (ytc_UgxWQwok2…)
- "Find someone more skeptical of AI’s abilities. I’d like to see what they have to…" (ytc_UgyZ5ZdR3…)
Comment
@thefunseeker9545 Your final point in parentheses is meant to be an aside, but it kind of just reinforces my point. Evolution IS just a process. It is not intelligent. It doesn't even fall into the same category of things that can or cannot have intelligence. It does not have a will, or a goal, or intent. Therefore, saying that humans are smarter than evolution is absurd. It's a non-statement. The two things aren't comparable.
One thing humans seem to excel at is attributing meaning and intent to inanimate things and concepts. It's the reason we have religion and conspiracy theories and superstition. It's also the reason we have art, and why we see shapes in the clouds and constellations in the stars. The same is happening with AI. We WANT to think it is more intelligent than it is. We WANT there to be a hidden intent behind the random bad advice it gives. We WANT this machine we created to be smarter than us so that it can solve the problems we created, when in all likelihood, it's just another problem on top of the pile.
youtube
AI Moral Status
2025-11-09T15:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugw6qr1qi3ZgbVQPVxB4AaABAg.APG_a3AE73YAPIYRsisc7r","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyYovf1Kh45Bje4_QB4AaABAg.APGUqBkHhucAPWXvzC71aq","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwmHgmTqeYmeUJMETd4AaABAg.APG6FdDC6vlAPI2Rttb0hO","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgxUWcrf7s1vGuSCuuJ4AaABAg.APFwIJYpq_vAPHy-KfuPM0","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxUWcrf7s1vGuSCuuJ4AaABAg.APFwIJYpq_vAPI-4WfP1fJ","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwQMkurrBx9sp3_7o54AaABAg.APFct10SB_cAPHyDZZUAqQ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx8RZB2H1FIrghQu4d4AaABAg.APFRfAKUE3EAPJIGxGSSP3","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxHFFQON6Um18Fzvap4AaABAg.APFLSHSuE_1AQNWw3A0_ka","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyB27wG8OxSx0T0P7d4AaABAg.APFBDI5D5g5APK7z3u_gOd","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyEiE-N1Aot_-H0m8Z4AaABAg.APEUeffS1oVAPHzR8fguN1","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
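A lookup-by-comment-ID view like the one above can be built by parsing the raw LLM response (a JSON array of coded records) and indexing it by `id`. The sketch below is a minimal, hypothetical implementation: the `ALLOWED` value sets are inferred only from the responses shown here (the real codebook may define more categories), and the function name and sample IDs are illustrative, not part of the tool.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the
# sample responses above; the actual codebook may include more values.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"unclear", "ban", "none"},
    "emotion": {"indifference", "mixed", "resignation", "fear", "outrage"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coded comments) and
    return a dict keyed by comment ID, validating each dimension."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

# Toy response with hypothetical IDs, shaped like the array above.
raw = """[
  {"id": "ytr_a", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_b", "responsibility": "developer", "reasoning": "deontological",
   "policy": "ban", "emotion": "outrage"}
]"""

codings = index_codings(raw)
print(codings["ytr_b"]["emotion"])  # prints "outrage"
```

Validating against the allowed sets at parse time surfaces malformed or hallucinated labels immediately, rather than letting them silently enter the coded dataset.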