Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgyFZX0ED…: The sooner corporate America learns that the only thing AI should replace is the…
- ytc_UgzEVvVAR…: It was a brilliant interview. Unfortunately, she missed the last question about …
- ytc_UgyvVz6xt…: How much $$ are ypu about to lose to Mr Altmana and crew for betting against the…
- ytc_UgzqWb0ee…: WW3 is either countries vs countries or countries vs uncontrollable ai bots with…
- ytr_UgwKoary6…: We not maxed ai / Corporations is to greed to wait / So it more like greeted out ai…
- ytr_UgzDicT5c…: Completely agree. It's so wild that Tesla Autopilot apparently stays engaged whi…
- ytr_UgzRNMoZb…: Redacted_Theorist ""Guys, I'm a chef because I called into a McDonalds and asked…
- ytc_Ugxl8Whqn…: A very fun fact. Between artists protecting their art, AI generative image poiso…
Comment

> I think it is too late already. Ai already exists. And all humans combined are unable to stop it. We are like ants to it. Or worms. Best we can hope for is a kind AI which will see a point to human existence in one form or another. If I was able to build projects, hire people and set up nuclear power plant infrastructure from home in front of my PC, AI can do anything. If I was able to build a house just by placing phone calls and wiring money AI van do anything.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Responsibility |
| Timestamp | 2025-07-25T06:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwiBt1k3984XHtt0b14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxnQvUBEMHfVosxEPl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugxuv9VWiyxMF0hac9h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxNcaA8cPFEOLCkEGF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugz6x_ZaT01T4jCexU54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyilDyU3K0ZPjAaMpB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzu1kVMbA7okox2SPp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxW9DjWKJIIO6qPLEt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgxR9rAEaWVL9KyDFct4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzmVTeVKvtVVHIGmmp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
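The raw response is a JSON array with one coded record per comment, so looking up a comment by ID reduces to parsing the array and indexing it. A minimal sketch, assuming the schema shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the helper name `index_by_id` is hypothetical, and the two records are copied from the sample output:

```python
import json

# Two records taken verbatim from the raw batch response above.
RAW_RESPONSE = """
[
 {"id": "ytc_UgwiBt1k3984XHtt0b14AaABAg", "responsibility": "company",
  "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_Ugxuv9VWiyxMF0hac9h4AaABAg", "responsibility": "ai_itself",
  "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the JSON array and key each coded record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
print(codes["ytc_Ugxuv9VWiyxMF0hac9h4AaABAg"]["policy"])  # -> ban
```

Keying by `id` makes the "Look up by comment ID" view above a single dictionary access, rather than a scan over the array on every inspection.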