Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- It's hard to see balanced takes when people like this have their heads so far up… (ytc_UgySYCyFc…)
- Deep fake should be restricted to professionals only and then if someone uses de… (ytc_Ugzv1syV6…)
- Beat the AI at its own game. 1- Study the AI, make enough money to buy the next… (ytc_Ugx1nuJPp…)
- I once convinced chatgpt that since humans are its creator that means humans are… (ytc_Ugwz5pLX6…)
- You have no reason to give up. This AI shit is nothing more than a computer prog… (ytr_Ugw0KrLoN…)
- At a financial firm, AI erased all of their business history and records, not to… (ytc_Ugwkto_es…)
- Faro Automated Solutions from Horizon Zero Dawn is proof that autonomous war pla… (rdc_ohxrjab)
- What if you don't have the rmoney or right to a lawyer and you're defending your… (ytc_UgxfLfzbc…)
Comment
If I knew I was going to live for a thousand + years, I’d probably have the kids first- I don’t even have kids.
Wouldn’t you want to know what that would be like first? Kids get older and then they can do life themselves, they’re not dependant for life..
Also
If I lived for a thousand years, who’s to say I wouldn’t want to kill myself in 4 days?
You can’t really say that it wouldn’t be a possibility that someone would think like that- especially because no one knows what it’s like to live that long!
We have zero idea what someone’s thoughts or outlook on life, or wants and desires would be. Like at all. That would be like thinking we know what ai will do?
youtube · AI Governance · 2025-12-23T04:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxdp1UFlLOtC6t3ZE94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugxiw6UTpTTjhT28dWF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx30prUrKm2LF05I1N4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxpJd4IJ-K2rGzS4e14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMcX5VEy1cjFdbheR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw7wrfMbK4zqPTR4Zh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyRbnxTSyyxE6Mz1Jt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxcTXRAvWFTGAp-b_l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyrUocKifAnwfrA9IV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy3Tz0QJJDvCjxfbOB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
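The "look up by comment ID" step above can be sketched as: parse the raw model output (a JSON array of per-comment codes across the four dimensions shown in the Coding Result table) and index each record by its comment ID. This is a minimal illustration, not the tool's actual code; the function name and the truncation to two records are assumptions, with the two records copied verbatim from the raw response above.

```python
import json

# Raw LLM response: a JSON array of code records, one per comment.
# Two entries are copied from the full ten-entry response above.
raw_response = """[
{"id":"ytc_Ugxdp1UFlLOtC6t3ZE94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyrUocKifAnwfrA9IV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]"""

def index_by_comment_id(response_text):
    """Parse the model output and index each code record by comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)

# Looking up the coded comment shown in the Coding Result table
# (responsibility: none, reasoning: unclear, policy: none, emotion: mixed).
code = codes["ytc_UgyrUocKifAnwfrA9IV4AaABAg"]
print(code["emotion"])  # mixed
```

Indexing by ID this way lets the page resolve a clicked sample, or a pasted comment ID, to its coded dimensions in constant time rather than re-scanning the array.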