Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
This should scare people. Every step forward these people make with robots is on…
ytc_UgwbDa02U…
Despite appearances, machine learning isn’t self-sufficient. Every model depends…
ytc_UgzbMZgwc…
@Some Random Fellow better than yours, I have made art that made people ask how…
ytr_Ugz3BPYlk…
There's 0 percent chance this type of AI is going to be used to let people relax…
ytc_Ugz0Vkwrl…
"Youre example of a flawed mona lisa is also wrong imo as you even hinted-the AI…
ytr_Ugyggs1f6…
The biggest User who profited hand over fist using it to grow his business now t…
ytc_UgzZntcsJ…
I think when we all get inundated will all the AI generated crap, people will st…
ytr_Ugw5DYcM-…
These "AI advocates" are dust.
I don't want to see some disembodied notion of "L…
ytc_UgxqbQmIB…
Comment
Wondering: Tail lights seem a little small and much more clear than red. Is that true? (Red to me would indicate a better danger warning than yellow or white).
Also on autopilot when tesla slows down do the rear lights flash (drawing more attention) or do they just turn on?
Edit:
I'm always wary of machines that do the thinking for you. It's okay if nothing unusual is happening but in life 30% of the time something unusual is happening (whether you're aware of it or not). I trust my reaction time, experience and brain over someone's algorithm at a desk.
Autopilot is best left to planes with 1,000's of feet between them and the closest aircraft.
youtube · 2022-12-21T05:3… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwEh2u7XTb3HVdc1g4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyvjL020MlgSx9UyJJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgwVq0btNGXpHpdK6LB4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxYa2ly2s5jKgViAhx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzEw-4UpMScv7BDuJZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzu03TL-3g9YxFO60J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"disapproval"},
{"id":"ytc_Ugxk0VPt0B8OSf1JQ-l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyfHb73Cq0PFX5Qzwx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugw_FfOHqtFjxx-zyQl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugxl1yKQG6The3HB_EN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
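A minimal sketch of how a batch response like the one above could be parsed back into per-comment codes. The allowed label sets in `SCHEMA` are inferred only from the values visible in this sample (the real codebook may define more categories), and `parse_response` is a hypothetical helper, not part of the tool itself:

```python
import json

# Allowed values per dimension — inferred from the sample output above;
# the actual codebook may include additional labels (assumption).
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "disapproval", "indifference"},
}

def parse_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}.

    Rows with a missing/unrecognized id prefix or an out-of-schema
    label are dropped rather than stored, so downstream tables only
    ever contain valid category values.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id", "")
        # Comment ids in the sample start with ytc_ (top-level) or ytr_ (reply).
        if not cid.startswith(("ytc_", "ytr_")):
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded
```

For example, feeding in the third row of the sample yields the same values shown in the Coding Result table: `responsibility=company`, `reasoning=mixed`, `policy=liability`, `emotion=fear`.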