Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I interact with AI a lot. It's basically good. Humans can be good and bad too.…" (`ytc_Ugy8QBGWa…`)
- "Let the Ai loosers have their own conventions, it would be a funny experience. E…" (`ytc_Ugxtuz7PO…`)
- "Here is a thought experiment. In Isaac Asimov's Foundation Trilogy, there was a…" (`ytc_UgyJ9ceWe…`)
- "I no longer agree with using it as reference, ai does not understand what goes w…" (`ytr_UgxfvVJW5…`)
- "@isitanos yeah I didn't read into it too much because I thought it could be some…" (`ytr_Ugw7Mo0sg…`)
- "We can also look at it this ways, the brain named itself, saw its own flaws and …" (`ytc_UgygV8AkI…`)
- "Oh hey, you are using a comparison that doesn't work in the slightest. Cars work…" (`ytr_UgzQY-i6A…`)
- "Would you say AI moves from one data point to the other and pieces together an a…" (`ytr_Ugz8_43XS…`)
Comment

> This also means that social security will get completely blown out too relatively soon after many millions of jobs are lost and that system will need to be replaced with some alternative also. // As far as safety we hope for the best. What happens when this superintelligence leaps so far ahead of humans that it considers us no more than we think of insects? Meanwhile, we should be as concerned about some of the humans controlling AI in the near future before that superintelligence develops into a singularity.

youtube · AI Governance · 2025-09-08T09:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxPfh6dA8OeHyu_UYJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyqZS5ouOvY0_owWVt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwRkur7E91K8TZ28aV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwqjt3YGl_SNqb6SkJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgykL2t20ZKbUZFAPJ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzC7265l1qtwvaY0Bd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZuyfg7qypTToZVdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxCYvKeZbt6MFeyDRF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxpLVdNGwNCj9WeA5J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxvXCGmiuqqFAsdygR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"approval"}
]
```