Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I mean the AI wasn’t wrong… he was involved in a shooting. Just wasn’t what the …
ytc_UgzIVacxi…
These guys are so unimaginative when it comes to what the dark side of AI will d…
ytc_UgwHo5W3t…
I want to see how a phone ChatGPT deals with a virtual assistant chatbot that ne…
ytc_UgzbAsohu…
The religious ai lost specifically because morality is man made. Presupposing an…
ytr_UgyuLdhrO…
All the AI dormers never grapple that AI will still rely on physical machine tha…
ytc_UgwaxQOve…
Hey @mohdomartokyan7003, thanks for your concern! But have you ever considered t…
ytr_Ugw-wtZ34…
If AI is meant to be more intelligent and smarter than us as human beings , then…
ytc_UgxrvXwQ3…
I ACCIDENTLY BROKE THE FILTER AND THE AI SAID
"do you wanna...have yk..s-s you …
ytc_Ugy7ALPtd…
Comment
I know this has been mentioned before, but this is a dangerous road... for humans. We're silly creatures that sometimes want to push our limits with strange things, and in the case of this road, it's how fast someone can get through it without crashing. You're car on the other hand treated it like a suburban road, barely breaking 30 mph on a two lane road that's well maintained, has no breaks, and no pedestrians. This is something a lab would love to show of their "full self driving" to gullible people, because this is something that looks hard, but is extremely easy for even simple AI. It just has to follow the road that is on the map at the set speed limit and follow the well kept lines on the road. Also love how you made light of your car literally stopping in the middle of a dangerous road for humans because it thought a car was in the way while said car is parked on the shoulder. That is a dangerous failure right there that you decided to play up for laughs. Tesla has the AI disengage before a crash so that the crash isn't technically Tesla's fault since you were in control at that time. If you were going at high speed and had something similar happen, you probably would have found yourself killing both that drive and yourself.
youtube
2022-12-05T06:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
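The table above can be produced mechanically from a coded record. A minimal sketch, assuming the four dimensions shown (responsibility, reasoning, policy, emotion) plus an optional timestamp; the function name `render_coding_table` is an invention for illustration, not part of the tool:

```python
# Hypothetical helper: render one coded record as the markdown table above.
# Dimension names and example values are taken from this page.
DIMENSIONS = ["responsibility", "reasoning", "policy", "emotion"]

def render_coding_table(record: dict) -> str:
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim in DIMENSIONS:
        # Fall back to "unclear" if the model omitted a dimension.
        rows.append(f"| {dim.capitalize()} | {record.get(dim, 'unclear')} |")
    if "coded_at" in record:
        rows.append(f"| Coded at | {record['coded_at']} |")
    return "\n".join(rows)

example = {
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
    "coded_at": "2026-04-26T23:09:12.988011",
}
print(render_coding_table(example))
```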
Raw LLM Response
[
{"id":"ytc_UgzR38jJE2YJC_2dp9V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDgA3SdQLDf5sbJnZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzMJHctMlGqNRmlBnF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzXj4cvD2gZ0wmxyj54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzEBeeTvk6Bnj9IxIV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzBJWwRynO7e1JEHLx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwmoCErVbpsF774dAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzKACy2THfXpvj-EN54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxsBZfVwaxTrF0CsrJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjBnljdvS3gNr45JJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
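The "look up by comment ID" view implies parsing this raw array and indexing it by `id`. A hedged sketch of that step, reusing two records from the response above (nothing beyond them is assumed):

```python
import json

# Parse a raw LLM response like the JSON array above and build an
# id -> record index so a coded comment can be looked up by its ID.
raw_response = '''[
  {"id":"ytc_UgzR38jJE2YJC_2dp9V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwmoCErVbpsF774dAZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

records = json.loads(raw_response)
by_id = {rec["id"]: rec for rec in records}

# Look up one coded comment by ID.
rec = by_id["ytc_UgzR38jJE2YJC_2dp9V4AaABAg"]
print(rec["emotion"])  # -> outrage
```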