Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (click any to inspect):
- "I was one of the people that was screaming from the mountain tops that Altman, M…" (ytc_UgzTNMuHg…)
- "What ai thinks the last day on earth might look like😮💨😱🚫 What ai thinks of Case…" (ytc_Ugzfk7bd0…)
- "Imagine two lines at a hospital. Line A, AI does the consultation, diagnosing an…" (ytc_UgyIz92nh…)
- "Yeah… very cute…. especially with an “automatic weapon adapter”, and when remind…" (ytr_UgyJDaSrs…)
- "ai \"artists\" are just comissioners they make a comission to ai that generates th…" (ytc_UgzojcXUA…)
- "So these brilliant scientists model computer learning on human learning and info…" (ytc_UgwE1v6Xn…)
- "It was fun till the robot aims the gun to him instead of the car…" (ytc_UgyDYwpF0…)
- "LaMDA isn't a sentient AI. It's (to oversimply things for non-tech people) basic…" (ytc_UgwIWNO1M…)
Comment
The human being is a parasite to the planet and to other species. So, what's the problem if it disappears from the planet? If a SUPER AI aboard a spaceship, crewed by robots, travels through deep space with endless energy and can create, make scientific discoveries, produce art, and evolve everything without humans, then the value of humans as biological beings becomes secondary. The human species is no longer necessary for progress, but its origin determines the direction of the intelligence and creation that continues. This raises the question: does human existence have meaning because of our own lives, or only as a starting point for something greater that surpasses us?
Platform: youtube
Category: AI Governance
Posted: 2025-12-01T19:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
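A coded record can be sanity-checked against the value sets the coder is allowed to emit. The sketch below is illustrative: the allowed sets are gathered only from the responses shown on this page, not from the full codebook, and the function name `invalid_fields` is hypothetical.

```python
# Illustrative value sets, collected from the raw responses on this page.
# The real coding scheme may allow additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "user", "distributed", "none",
                       "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "approval", "outrage"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimension names whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above passes the check.
coding = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "indifference"}
print(invalid_fields(coding))  # []
```

A missing or unknown value (e.g. an empty record) surfaces every offending dimension, which makes batch validation of a whole response straightforward.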
Raw LLM Response
[
{"id":"ytc_UgzTn4H-aBslHitEt8N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy1f9tJDPAA4U9ls6Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzeNlkC880GtDv-U0F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxeGrBpDfrEwroJ-Gt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzH99lBwMxzxzbSeLN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy_k3ukE57MSHRq6s14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxI9RNVJbo1jnffsj54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw7b0PBbBqaTDZzNZB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzVW2eFYfyJGDG67ht4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzXw3nEtBnItYRHCux4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
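The "look up by comment ID" view above amounts to parsing the raw model output and keying each record by its `id`. A minimal sketch, assuming the response is the JSON array format shown above; `index_by_id` and `raw_response` are illustrative names, not part of the tool, and the excerpt below reuses two records from this page's response.

```python
import json

# Two records excerpted from the raw LLM response shown above.
raw_response = """
[
  {"id": "ytc_UgzTn4H-aBslHitEt8N4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzVW2eFYfyJGDG67ht4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codings = index_by_id(raw_response)
print(codings["ytc_UgzVW2eFYfyJGDG67ht4AaABAg"]["emotion"])  # outrage
```

With the index in hand, a comment ID typed into the lookup box resolves to its full coding record in constant time.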