Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I'm cool homie 🤣🤣🤣 that robot will be fucked up in the memory seeing the activit…" (ytc_UgwPheSok…)
- "I remember there was a time when people feared that tools like Hatsune Miku woul…" (ytc_Ugz4-bIPf…)
- "got high and ate a bowl of mashed potatoes at 2 pm, out in the summer sun. i'd l…" (ytc_UgyPMmWsy…)
- "Funny, but for real some smart people find ways of dealing with unhelpful though…" (ytc_UgyfTqpQZ…)
- "Ever heard of the movie irobot? That's what's ganna happen with these stupid ai …" (ytc_Ugz3Wnk8U…)
- "The issue that people like Elon Musk, Thiel, Gates, etc.. have aren't with A.I.,…" (ytc_Ugi28m3CG…)
- "ngl now when I'm looking at a godly beautiful artwork I can't differentiate whet…" (ytr_UgyKcL0L7…)
- "It eill take decades, after we stop wars, build quantum computers and androids, …" (ytc_Ugwd2KOjs…)
Comment
I think you are right that autopilot doesn't mean autonomous, and people should be ready to take over at any time - as Tesla itself acknowledges, since it requires hands on the wheel at all times. But I still think Tesla calling their system "autopilot" is dangerous, because the vast majority of car drivers don't know what an airplane's autopilot actually does. The name likely leads many of them to think the system is mostly fail-proof, so they greatly lower their attention and guard, assuming that Tesla's autopilot mode is going to handle everything.
When safety relies on the lowest common denominator, to me it makes more sense to build and name the system appropriately, knowing that most people are morons and don't understand what 'autopilot' means.
Calling it autopilot in the aviation business is fine because everyone is well trained and experienced, and also in charge of many people's lives, so they know how important their role is.
Calling Tesla's mode "autopilot" likely just causes a lot of laymen to over-rely on the system while they drive only themselves and a few passengers, as they do every day.
Source: youtube · Category: AI Harm Incident · Posted: 2022-09-07T00:5… · ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgxGFRO40uyLs743S5l4AaABAg.9f_h_lmNDG49fcWjV5ST1T","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxobKS_erN63elZXjh4AaABAg.9f_a1Hqi_7l9fbSmUtYvnp","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxobKS_erN63elZXjh4AaABAg.9f_a1Hqi_7l9fbTEJekhWM","responsibility":"none","reasoning":"virtue","policy":"regulate","emotion":"indifference"},
{"id":"ytr_UgxDw_hf_yKQUFR5B6J4AaABAg.9f_Wn24aPFn9fenuOIXq0-","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgxDw_hf_yKQUFR5B6J4AaABAg.9f_Wn24aPFn9fesqHFLEiO","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgwA-cuAlrYFrmRrmFt4AaABAg.9f_82ZYVkcS9feQ47I6m8q","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugy428XSf8iGdC4L1Wd4AaABAg.9f_66rDSbnc9fbE4RBpxo-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytr_UgyWzPD4vYfX32DHyzF4AaABAg.9f_41m8mCQv9fdQ0viC_09","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyWzPD4vYfX32DHyzF4AaABAg.9f_41m8mCQv9feF4WLy4x0","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_Ugyvgjaq2iSqCaKCETl4AaABAg.9fZyR6xVMg39f_-0TZPBxt","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
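The raw response above is a JSON array in which every entry carries the same four coding dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion) plus the comment ID. A minimal sketch of how such a response could be parsed, validated, and searched by (possibly truncated) comment ID, as the lookup box above allows; the function names `parse_codings` and `lookup` are illustrative, not part of the tool:

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = '''
[
  {"id":"ytr_UgxGFRO40uyLs743S5l4AaABAg.9f_h_lmNDG49fcWjV5ST1T","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwA-cuAlrYFrmRrmFt4AaABAg.9f_82ZYVkcS9feQ47I6m8q","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
'''

# Every coded entry must carry the comment ID plus the four dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response, keeping only entries with all coding dimensions."""
    entries = json.loads(text)
    return [e for e in entries if REQUIRED_KEYS <= e.keys()]

def lookup(entries, id_prefix):
    """Find codings by comment ID prefix, so truncated IDs like 'ytr_UgyKcL0L7' still match."""
    return [e for e in entries if e["id"].startswith(id_prefix)]

codings = parse_codings(raw)
match = lookup(codings, "ytr_UgwA-cuAlrYFrmRrmFt4AaABAg")
```

Prefix matching makes the truncated IDs shown in the sample list usable as search keys; the second entry here is the one whose coding (company / deontological / regulate / fear) appears in the Coding Result table above.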