Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
So wait, we have ai powered robot waifus and ai has emotion. To quote the best …
ytc_UgwZzTePB…
Agreed, most of the expression in this clip comes from the eyes, basically no mo…
rdc_nepfcnl
Sounds like this guys is talking about the future of the humanity. There is noth…
ytc_Ugx966xqh…
Remember when they said computers would make life easier?? The reality is that i…
ytc_UgwTvre2I…
@5:54 “While I’m lying, I will teach my AIs not to lie. It will not work. It wil…
ytc_UgyaWQ6n-…
I mean yeah it's messed up but who's to say he was fabbing if there was a deepfa…
ytc_UgxS7BcH5…
Is a ship without a mast still a ship? What about a ship without a rudder? What …
ytr_UgzmC20FG…
If companies use AI to cut costs because hiring people is expensive — then why a…
ytc_UgwUTw7hZ…
Comment
This video has to be tonque in cheek. People driven cars slaughter 35 - 40 thousands fellow citizens in the USA each year. These "rare" cases and "theoretical" Terorist scenarios, full bore, will kill how many?? There will always be people who will fight the automobile, train, computers, AI, etc., and they will always lose and be left behind. The only laws, concerning driverless cars in the future will be that humans will need special permissions to drive cars. If human drivers are taken off the road, the driverless cars fatality rate would drop to near zero, versus tens of thousands. I am also amazed that you portrayed the possible "creation of new jobs", as being a negative, given tens of thousands of white collar workers will be unemployed in the near future. Finally companies like Tesla, (eventually Waymo and others) will link driverless cars to AI 'neural" neworks where even the "rare" cases will be handled as well, or, better than humans.
youtube
AI Harm Incident
2026-04-24T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzAmE6WFrpH79DLRGt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzCjgaFEh83VUiYSd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwDhspAqJOkTYj0k7t4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzaHJNYfrWfY8Ku_K54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugxagg68u_yyF4egUG14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy1sktigPFtf2eRQQ14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyYLrYZnzkJne7C1XZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwMeFv49mTEeTt0ObN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyQiF2alDI_Y_CUWLp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_UgyauiBGrLE4lAQJ0Xl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
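The raw response above is a JSON array in which each row codes one comment on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and validated is shown below; note that the allowed value sets and the ID prefixes (`ytc_`, `ytr_`, `rdc_`) are assumptions inferred only from the values visible on this page, not a documented schema.

```python
import json

# Assumed value sets, inferred from the rows displayed above.
# The real coding scheme may include categories not seen here.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "mixed",
                "approval", "resignation"},
}

# ID prefixes seen on this page (YouTube comment/reply, Reddit comment).
ID_PREFIXES = ("ytc_", "ytr_", "rdc_")


def parse_coding(raw: str) -> list[dict]:
    """Parse one raw LLM response; keep only rows that look valid."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Skip rows whose ID does not match a known prefix.
        if not str(row.get("id", "")).startswith(ID_PREFIXES):
            continue
        # Keep the row only if every dimension has a recognized value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

Filtering rather than raising on unexpected values lets a batch of codings survive one malformed row, at the cost of silently dropping it; a production pipeline might log rejects instead.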