Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The public has not been given the information needed to understand the benefits …" (ytc_UgxONUEVg…)
- ">Or crash their self-driving car. Hacking, the reason I will never get a car…" (rdc_dvt2413)
- "The radar system is unreliable cus it isnt AI. Even with Teslas with radar, they…" (ytr_UgydaZCpS…)
- "For AI to become scary, money will need to be obsolete. If no one has money, the…" (ytc_Ugzj83KC9…)
- "With all the power of AI, humans are insanely complex! 10 years and we are now …" (ytr_UgxIvh3pw…)
- "In situations where staff are using mouse jigglers to fake productivity, I think…" (ytc_UgwAFtbBp…)
- "A person in psychosis, or the one who has a higher risk of getting psychosis(tha…" (ytr_UgwhPGG2R…)
- "This program - written by AI - is designed to lull humans into a false sense of …" (ytc_UgyQbYz6p…)
Comment
it's sort of the bottom line. AI did not cause this. People caused it, we built it and we made it thrive because of personal benefit. If people in charge cared for others, they'd stop now. Instead they're happy to continue, reap the profits and let the poor figure it out among themselves. Just as he did.
youtube · AI Governance · 2025-07-02T08:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgyCLBEJAZHeRl4oaYV4AaABAg.AK-wckndrYCAK3sdOPNRza","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgzMwA-2As4laPuRoVp4AaABAg.AJz3mGo7qjuAK598EIrC3o","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytr_UgyENOH579RxGauK-Fd4AaABAg.AJxQi3GpjsJAJyNJyA82pX","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_UgxGcR7_R7Qvr8MeJN54AaABAg.AJxDAzLcnKoAK53ZbbSD8N","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzQ4G3PwmHBfIFbOoV4AaABAg.AJwsdfgS75tAJwtX_TBG5C","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwRzp1RMgsGaXXIHYR4AaABAg.AJwRrAGjn4UAK4EOuziUyG","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgxZT6LuDvkTW0mRrNt4AaABAg.AJwL-E2cvILAK4FBe6qO1x","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytr_Ugz8kXJ54R54LGC4pDl4AaABAg.AJw8sFdpsn3AK4L8pyijZP","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytr_UgxuKqS5SR4qTNHORi94AaABAg.AJuhcfLiaW9AJv0uOzmZHv","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyCLi-a9qrVIhXsxQB4AaABAg.AJu8hia4-F-AJv5cYcGsPA","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
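A minimal sketch of how a raw response like the one above might be parsed back into per-comment coding results. This is an illustration, not the tool's actual implementation; the comment IDs here (`ytr_abc`, `ytr_def`) are hypothetical placeholders, and only the five fields shown in the response (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are assumed.

```python
import json

# Hypothetical raw LLM response in the same JSON schema as shown above.
raw = '''[
{"id":"ytr_abc","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_def","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# The five keys every coded entry is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Return {comment_id: coding dict}, skipping entries missing any expected key."""
    results = {}
    for entry in json.loads(text):
        if EXPECTED_KEYS.issubset(entry):  # dict containment checks keys
            results[entry["id"]] = {
                k: entry[k] for k in EXPECTED_KEYS - {"id"}
            }
    return results

codings = parse_codings(raw)
print(codings["ytr_abc"]["emotion"])  # outrage
```

Validating keys before indexing keeps one malformed entry in the model output from crashing the whole batch, which matters when the same response covers many comments.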