Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytr_UgyXJ7SJl… : "@salemsscience7274 😂 that's not how it works . If you don't code something in …"
- ytr_Ugz28NXuP… : "The dude with the beard is the GOAT of AI safety. Worth paying attention, maybe …"
- ytc_Ugw4Jedkt… : "It's enraging and I was ready to comment about AI...but why?! Why can't any of y…"
- ytc_UgznZ-yPY… : "AI is an ancient GOD that greedy foolish humans think they created and can contr…"
- ytr_UgygsE2rZ… : "IMO, it's already automated enough to kick out all the people working in the fie…"
- ytc_UgzWIad-s… : "In the future u will have to rent a robot to go work so u don't and your going t…"
- ytc_UgyKSONbT… : "A very similar situation began unfolding in Australia from around 2024. A very h…"
- ytc_UgxC3MegP… : "What do you call someone very intelligent, with no conscience, and limited to no…"
Comment
I think Tesla's biggest problem here is having marketed their product as auto-drive. It would have been better for them had they stayed with something like co-pilot or wingman. Even state of the art jumbo jets which can take off and land themselves still require a human pilot and copilot.
Platform: youtube
Event: AI Harm Incident
Posted: 2025-10-06T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwuQRD2kupySFWdpTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyfovMhdm3eLeQV9G14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzXhs9F4VXJJur7V4d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxm7nng80gPtO_qWEx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgznIuALORolMMREhxh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzRAWB1j9sIV36PwAB4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugw4a-Xy9TqWSUF5xWB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwnoKYJtEYMF6GUwbF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwvKXyJ3tTKFOnqNpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw1gx3jQu_InJLPGqp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
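A coding result like the table above can be recovered from the raw model output by parsing the JSON array, validating each row against the codebook, and indexing rows by comment ID. The sketch below is a minimal illustration, assuming the label sets are exactly those observed in this sample batch (the project's actual codebook may allow more values); the comment ID used in the example is the row from this response whose values match the table.

```python
import json

# Allowed values per dimension, as observed in this sample batch.
# Assumption: the full codebook may define additional labels.
ALLOWED = {
    "responsibility": {"user", "company", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}

def parse_codings(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index valid rows by comment ID.

    Rows with a missing dimension or an out-of-codebook value are
    skipped, so malformed model output never reaches the coded dataset.
    """
    by_id = {}
    for row in json.loads(raw):
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[row["id"]] = row
    return by_id

# Example: one well-formed row and one with an out-of-codebook label.
raw = (
    '[{"id":"ytc_Ugxm7nng80gPtO_qWEx4AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"unclear","emotion":"mixed"},'
    '{"id":"ytc_bad","responsibility":"alien","reasoning":"unclear",'
    '"policy":"none","emotion":"fear"}]'
)
codings = parse_codings(raw)
print(codings["ytc_Ugxm7nng80gPtO_qWEx4AaABAg"]["responsibility"])  # company
```

Validating before indexing is the key design choice: LLM outputs occasionally drift outside the codebook, and dropping those rows at parse time keeps the per-dimension tallies clean.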