Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
@LucasCulpepper This is a great video that you used in your playlist. It provide…
ytc_UgwX977hr…
Considering that artists claim art is subjective, then there is no argument to s…
ytc_UgwJYgOT1…
Unfortunately I believe the one place AI stands a reasonable chance of being pro…
ytc_Ugw1zB547…
HUMANS WILL NOT BE ABLE TO CONTROL THE ROBOTS. HUMANS ARE THE EVIL . ROBOTS DON…
ytc_UgypGDaUC…
I wonder if these men started getting deep faked back like the president start t…
ytc_UgxNxb94p…
Another possibility I see, Without being an AI scientist, Is AI will become so s…
ytc_UgyGeC6X3…
Artificial intelligence has algorithms that let robots create their own new idea…
ytc_Ugyd8UDRE…
robot dont have the ability to walk clean and smooth yet they have to be people…
ytc_UgxM36671…
Comment
I agree with the Savagegeese channel. The AI car problem is not going to be solved. The only way to solve it is to not need cars.
However, assuming we are happy with 'autopilot' with limitations, clearly, if safety were a true consideration, you would need to be certified to use 'autopilot' in a car, and that certification would be model/software specific, not an independent add-on to your licence. Much like how a pilot is trained to use automation in the air.
This would make people aware of the limitations of their system, teach them who is responsible, and encourage more cross-industry collaboration, so that similar systems are installed across brands, reducing the need for retraining when buying a new car and increasing the shared development effort to make the systems as good and as robust as possible.
youtube
AI Harm Incident
2022-09-05T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxS8KKt4TY-MSVvq114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgzpjKvL0Y86Xls04PJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwYOeoF3eIEGIFMj4N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyw2rgGurpRMrr2RrB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyq8RQePuS9y7Da_CN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxCuHwrbxN07duSTMd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugx0P1V1ZSk1YHuCHY94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxk4sPBoKlYw8MDDD54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugys_oOAhB9U_o7gal54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwXyz7GuoBlpC5liGN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
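The raw response above is a JSON array of coding rows keyed by comment ID, which is what makes the tool's "look up by comment ID" view possible. A minimal sketch of that lookup (the helper name is hypothetical; it assumes the raw response parses as a valid JSON array with the field names shown above):

```python
import json

def lookup_coding(raw_response: str, comment_id: str):
    # Parse the raw LLM response (a JSON array of coding rows) and
    # return the row whose "id" matches, or None if absent.
    rows = json.loads(raw_response)
    return next((r for r in rows if r.get("id") == comment_id), None)

# Abbreviated example using one row from the response above.
raw = '''[
  {"id": "ytc_Ugyq8RQePuS9y7Da_CN4AaABAg",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "mixed"}
]'''

row = lookup_coding(raw, "ytc_Ugyq8RQePuS9y7Da_CN4AaABAg")
print(row["policy"])  # -> regulate
```

A real inspector would also need to handle malformed model output (e.g. a `json.JSONDecodeError` when the LLM returns non-JSON text), which this sketch omits.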