Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
> The whole premise of autopilot is conflicted -- the point of something driving for you is so you dont have to do it. Unless it can work with 100.00% certainty, theres no point in using it. There's no way everyone thats using it is 100% focused on watching the road just in case something happens. Over time you start getting comfortable with it, getting more distracted with other things, then the one time you take your eyes off the road, that could be it. Its bound to happen. At least for other car manufacturers, the lidar will act as an emergency fallback for those 1% of cases. Government needs to mandate a lidar system for all so called autonomous aids so that companies can't gamble with your safety

youtube · AI Harm Incident · 2024-12-25T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyzze_I3dyELNAMNfZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyAPAB2qH3hbKGCsL94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzZNsn0jyFh6PIUS1d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzYUwsjav54WDZzkC14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxaMOd7Xcuw45OVHZV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxoZO6iSf-Tidqye8R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzwaIyrvLsTp9k3t_p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwAcUNaTMCNzdgpEOZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx_r0bYTcaU-Ht2N-J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzaWf0129tbl0cDUE14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
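The raw response is a JSON array of per-comment codes across four dimensions (responsibility, reasoning, policy, emotion). A minimal validation sketch in Python, assuming the allowed value sets are exactly those observed in the responses above (the real codebook may contain more values) — `validate_codes` and `CODEBOOK` are hypothetical names, not part of the tool shown:

```python
import json

# Assumed codebook: value sets inferred from the sample responses above;
# the full allowed sets used by the coding prompt may differ.
CODEBOOK = {
    "responsibility": {"user", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "mixed", "fear", "approval"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows that have an id
    and whose codes all fall inside the assumed codebook."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_example","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
print(len(validate_codes(raw)))  # prints 1
```

Dropping (rather than repairing) out-of-codebook rows keeps the downstream counts honest: a model that hallucinates a new label shows up as missing data instead of a silently miscoded comment.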