Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I wish these AI ‘Artists’ would actually do something instead of calling themsel…" (ytc_UgzXFdQqL…)
- "Exactly I always use it for the boilerplate, but I strip off 80% of the code fr…" (ytr_UgxXzIHMW…)
- "I've had this exact conversation with Gemini in regard to it constantly apologiz…" (ytc_UgzVcOMbf…)
- "AI some-what annoyed some part and cold to the interaction. Also Its a bit soun…" (ytc_UgzptFn0s…)
- "If we make AI Deep Fake P videos of everybody on the planet, then nobody can be …" (ytc_UgyW5klOE…)
- "Why even chance it and allow it to continue further? It’s a tool but its likelih…" (ytc_UgxRoPklq…)
- "If robots and AI take all our jobs, governments would start wars so we kill each…" (ytc_UgxcOO5E_…)
- "AI will break the wealth generation part of the cycle. Wealth begins when you s…" (ytc_UgxpJNJsM…)
Comment

> If a self driving car causes a crash. Who's responsible? Who's liable? Who is the party who will be required to buy insurance for that liability....
> The answer is it doesn't really matter. Because it's not the human passenger operator. Which means its going to be some else besides the person buying the car. Ie. Some company somewhere is going to make less money than they could make. Which means this is never going to take off.

Source: youtube · 2026-02-15T02:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
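The four coding dimensions in the table can be represented as a small record type. A minimal sketch, assuming the dimension values visible on this page (`company`/`user`/`ai_itself`/`none`, `deontological`/`consequentialist`/`virtue`, `ban`/`regulate`/`liability`/`none`, `outrage`/`approval`/`fear`/`mixed`) are the full codebook; the actual codebook may define more values:

```python
from dataclasses import dataclass

# Allowed values inferred from the codes shown on this page (assumption).
RESPONSIBILITY = {"company", "user", "ai_itself", "none"}
REASONING = {"deontological", "consequentialist", "virtue"}
POLICY = {"ban", "regulate", "liability", "none"}
EMOTION = {"outrage", "approval", "fear", "mixed"}

@dataclass
class CodedComment:
    """One coded comment, matching the keys in the raw LLM response."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # Reject any record whose dimension falls outside the codebook.
        return (self.responsibility in RESPONSIBILITY
                and self.reasoning in REASONING
                and self.policy in POLICY
                and self.emotion in EMOTION)

record = CodedComment("ytc_UgzjbTDyPAG1UVOhKNZ4AaABAg",
                      "company", "deontological", "liability", "mixed")
print(record.is_valid())  # True
```

A validation pass like this is useful before accepting a batch, since the model occasionally emits values outside the codebook.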
Raw LLM Response

```json
[
{"id":"ytc_Ugxx2dDcsXHjg1vbYNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyoB5bIT2sHwEAq6GB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBHO2diRt1RWkNu-p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzmpFnpvXOiMhq20WR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz30kq62uKf8fNI9VZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugyr09Sjf19iJJG8pa54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugyemg6m1_wlFXTEOxV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw1ii8uXK_ple6hYcx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwwJGBS6H_CEiNuHI14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjbTDyPAG1UVOhKNZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
```
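The "look up by comment ID" operation above amounts to parsing the model's JSON array and filtering on the `id` field. A minimal sketch, using two records from the raw response shown here (the real pipeline's storage and lookup code is not shown on this page):

```python
import json

# Two records copied from the raw LLM response above, for illustration.
raw_response = """
[
{"id":"ytc_Ugxx2dDcsXHjg1vbYNV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzjbTDyPAG1UVOhKNZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"}
]
"""

def lookup(coded_json: str, comment_id: str):
    """Return the coding record for one comment ID, or None if absent."""
    records = json.loads(coded_json)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_UgzjbTDyPAG1UVOhKNZ4AaABAg")
print(record["policy"])  # liability
```

For repeated lookups, building a `{id: record}` dict once is cheaper than scanning the list per query.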