Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by its comment ID.
Random samples — click to inspect:

- "A driverless truck??? How many people will they kill and keep quiet???? Trucks…" (ytc_UgzdcW-yD…)
- "So don't rent a car and a home because all of them ever used AI for writing.…" (ytr_UgxHdOp4T…)
- "My dad saw my ai chats.. I am still terrified bc uh, wally had some... INTERESTI…" (ytc_Ugy5bftAq…)
- "First up, I've never seen an artist who didn't have shitty art when they started…" (ytc_UgzUwvb0z…)
- "this is not an installation, is just using ai from the cloud and how much can we…" (ytc_UgwbO-ZsS…)
- "It's helping me currently. I've been doing open source coding for years, very sl…" (rdc_oh3p773)
- "@Coolskeleton2030 the datasets this models use are insanely big look up LAION 5b…" (ytr_UgwiZqUCg…)
- "Ironic, the guy who engineered AI to be sharp and mimic the human brain, is now …" (ytc_Ugyu10YOd…)
Comment
This Israeli is hyped up about this. Evil people for sure. He’s also full of it. You seriously think that in 3-5 yrs manual trade skill jobs will be taken over? No. He only mentioned, oh weird people would want humans to do certain jobs. Lol, no jew; that’s normal. Only jobs that will be taken fully over will be anything that has to do with the internet. This is being ushered in by governments who say “you will own nothing and be happy”. They forget we out number them and most people will not want AI. It’s going to be forced like it is now with the Dems in the US with their terrible self driving cars. This guy is laughable.
youtube · AI Governance · 2025-10-28T00:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwCzDMAvewJi1eIoUR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzpPg7yEOsSAYxiUip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzsDASEP3z8cwszSaB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwYO3gKxFh6qq-9x9p4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwz_0ulS4zQRus_tIB4AaABAg","responsibility":"unclear","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwZG84jMhPNwFYUvc54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxKvrlJzps1JZi3gIB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxKbvz-KT-YhfEJ3194AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"mixed"},
  {"id":"ytc_Ugw9LsDtxW9pMFB3tKJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwCoNq1-ls8he1f1Et4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
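The lookup-by-comment-ID workflow above can be sketched in a few lines: parse the raw LLM response as a JSON array and index the rows by their `id` field. This is a minimal sketch, not the tool's actual implementation; the IDs and dimension values are taken from the response shown above, and the variable names (`raw_response`, `coded`) are hypothetical.

```python
import json

# Raw LLM response: a JSON array with one coded row per comment ID
# (abridged to two rows from the response above for illustration).
raw_response = """
[
  {"id": "ytc_UgwZG84jMhPNwFYUvc54AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzpPg7yEOsSAYxiUip4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Index the coded rows by comment ID for direct lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment shown in the Coding Result panel.
row = coded["ytc_UgwZG84jMhPNwFYUvc54AaABAg"]
print(row["policy"], row["emotion"])  # regulate outrage
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when the same batch of coded rows is queried repeatedly from the inspection UI.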