Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "If anyone should go on strike, it's the people behind the production process of …" — ytc_UgzwPBh1e…
- "> Ugh. Personally I don't think anything we are working on has even the sligh…" — rdc_l5ukbbq
- "I went to school with now CEOs and senators. Can I be invited to talk about stu…" — ytc_Ugx11HRVX…
- "As a Clone Wars kid the initial memes were funny as hell. The art of subtlety is…" — ytc_UgxG8ib8X…
- "@EchoStastusAnd again you being an asshole. You and all Ai defenders are just t…" — ytr_UgxEBqLg3…
- "It's not clever and it means nothing. Write about X without using the word X... …" — ytr_Ugw6601iN…
- "This A.I crap feels like the flaming Moe episode at the end ..Home seeing all th…" — ytc_Ugy8lJFTQ…
- "This gentleman needs to review the Wiki on the Dunning–Kruger effect. It seems …" — ytc_UgwRv27Br…
Comment
> The income of the future should be based on your health. This is a Don't Die concept that Bryan Johnson doesn't truly speak on frequently. That we program A.I. (so yes, we pause it and program it appropriately, or guide spiritually whichever...I will digress here because this would take too long,) to focus on making sure the planet and all of it's innards "Don't die," and this also becomes our income. This will most likely go wrong in many different ways, BUT it will take SO much longer to end in total destruction. It will at LEAST give us time to evolve properly, appropriately, and then GTFO. - My thoughts are my OWN y'all... or whatever.

youtube · AI Governance · 2025-12-04T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyYYLacM0YRJHRXXe54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwSvM64Yp2FM_0zRHF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwKqfrV16YItYINF_l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwGIzChvluB3KdjLI14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxECJw8Eem0RNQOV9d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzp0GOiiZaJmCgNpOl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugyd2brReOaKLgZBrxN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx6d3Ih_7GYFiZTq2Z4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzV2wY2YZMkVFXZHgt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOnzx7t5JwmBiDvq14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
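Since the model returns one JSON array of codings per batch, a downstream step has to parse that array and discard malformed entries before storing them. A minimal sketch of such a validator, assuming the four dimensions shown in the Coding Result table and allowed values inferred only from the sample response above (the real codebook may define more):

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# This vocabulary is an assumption; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"distributed", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed coding entries."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # Each entry must be an object with a comment ID...
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        # ...and a recognized value on every dimension.
        if all(entry.get(dim) in values for dim, values in ALLOWED.items()):
            valid.append(entry)
    return valid
```

Entries with an unknown label (e.g. a hallucinated `"emotion": "joyful"`) are simply dropped here; a production pipeline might instead flag them for re-coding.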