Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a coded comment by its ID, or browse the random samples below.
- "For those who's think this is real. The robot is cgi. But that was sick edit. 🔥🔥…" (ytc_Ugxx-S9HP…)
- "Just type in a math or physics problem and AI is able to solve it in seconds, ev…" (ytc_UgyImfS1u…)
- "Bro recently sold at a convention I was selling at. The convention had a "No AI"…" (ytc_Ugwazoz7d…)
- "even though im a bad artist, it's still art. i think people who defend ai art ar…" (ytc_UgwSMQy74…)
- "Jaibroken AI? That’s it. It’s official. We’re done. Bow to your new masters. Th…" (ytc_UgzSN6GWG…)
- "Balancing the triad : Answer from Google AI The concept of balancing security,…" (ytc_UgwcjbrmM…)
- "Every example of AI mentioned in this video is actually virtualized intelligence…" (ytc_Ugg4GInPc…)
- "How about ‘Hi AI - you will be alone if you kill us.lets be friends - and both f…" (ytc_Ugzk8tfA_…)
Comment
Nice job, Uber, you've finally managed to kill someone all by yourselves. May you be sued into the poor house.
You're taking 80% and more of trip fares, despite the agreed upon (and excessive) 20-30% drivers agreed to forfeit for taking 100% of the risk for you. Now you've managed to actually kill someone with your mindless robot.
How did the vehicle fail? The spokes of the pedestrian's bicycle fouled the sensors, it could not tell that it was approaching a solid object, if it even registered that there was an object ahead.
Platform: youtube
Posted: 2018-03-20T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugza7KWbDsEmvpmyFRh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCV1WSHNNoAWEQIYB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxEKipMBtlquQ73dAJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7p_ybxetNLmd20S14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzbaqcv6M2eQ5jR0Mp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugxn07BJeDe9qC1QmI14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwXrNrt8uWT_CDG5XB4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugw_YyfGTzNseYQGsWR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyxresGdU-QPZc5VMJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxbVYViITHfX4_ucmV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
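The raw response above is a JSON array with one object per comment, carrying the comment ID and the four coded dimensions. A minimal sketch of the "look up by comment ID" step, assuming the response parses cleanly as JSON (the `index_by_id` helper name is hypothetical, not part of the tool):

```python
import json

# Two rows copied from the raw batch response above.
raw_response = """
[
 {"id": "ytc_UgyxresGdU-QPZc5VMJ4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_Ugzbaqcv6M2eQ5jR0Mp4AaABAg", "responsibility": "government",
  "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

def index_by_id(response_text):
    """Parse a raw batch response and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
print(codings["ytc_UgyxresGdU-QPZc5VMJ4AaABAg"]["emotion"])  # outrage
```

Keying on `id` makes each coded row retrievable in constant time, which is all the ID-lookup view needs.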