Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
always adapt. we implemented AI/ automation. it has taken over the mundane tasks…
ytc_Ugz7zRr76…
In 18 months, when AI fails to be anything more than an occasional help in some …
ytc_Ugxf0t6zS…
It's not that AI will take your job but who says it's ok for AI to take it…
ytc_Ugy6bev-r…
The name of "Artificial Intelligence" will need to change. We all know that. I…
ytc_Ugzmbxrfq…
Hey there! In case you're interested in more advanced discussions with AI models…
ytr_UgwkIRGcm…
If you find the truth with artificial intelligence, know that this truth is foun…
ytc_UgwDEocAu…
you mean that ass hole that won a Pokemon art contest using ai? Or did some othe…
ytr_UgydQTl5h…
So I feel like it'd be helpful to share my experiences here. I have dyslexia and…
ytc_UgzOzZcv7…
Comment
I don't even see the purpose of self driving taxies if u still need a driver to watch or make corrections, for that, just leave the human to drive. It's gotta b pretty boring to babysit a car for 8 hrs and expect the driver to do nothing else but watch the road. It's almost a set up on the driver to get board and distracted! Makes no sense. But I do get the technology has to start somewhere and needs to be in situations and used and monitored to work out the kinks and expose the pros and cons not already foreseen.
youtube
AI Harm Incident
2018-04-07T20:0…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzZSnnfj59UOcWUNUd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzMUqIxyPPq82ZPwH14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgznW18G3AMIE_uCDFB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwtM4pOivcujKaP98B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxXlsYlR39y7k7YXLl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz0LPFwCBLHB8OSH4d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxKr-UyshtI6A1hDOd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKQZ3qrwjF4MfFJGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzBsJ6NOsaDQOsdEPx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw6xrdTx1uwcPo-yZ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
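A raw response like the one above has to be parsed and validated before the codes are stored. The following is a minimal sketch of that step, assuming the field names shown in the JSON (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and inferring the allowed values per dimension from the sample output; the actual coding scheme may include more categories, and `parse_coding_response` and `ALLOWED` are hypothetical names, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the full coding scheme may define additional categories).
ALLOWED = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "resignation", "indifference", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in the samples start with "ytc_" (comments)
        # or "ytr_" (replies); skip anything else.
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgwtM4pOivcujKaP98B4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}]')
print(len(parse_coding_response(raw)))  # 1 valid record
```

Rejecting malformed records rather than repairing them keeps the coded dataset clean; failed records can be logged and re-sent to the model in a retry batch.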