Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "even robot start knowing their existence... still human is the worst of all livi…" (ytc_Ugx3atG_c…)
- "Ai might be evil; it might spell the extinction of the human race. But if you re…" (ytc_Ugzgo6CTR…)
- "Hahaha, LOL, there’s a lot of factors that needed to be considered when a driver…" (ytc_UgywmYmGu…)
- "This is all a big load of horse shit, He neglects to talk about all the jobs tha…" (ytc_Ugw50I5bh…)
- "*No automation without compensation!* Demand your rightful AI Dividend, your Ret…" (ytc_UgybJ-omc…)
- "I’m so happy that I don’t have the ai chat now. Ty strict parents ❤…" (ytc_Ugyu20OKf…)
- "can people make a robot like the one from the movie Andriod it would be relly aw…" (ytc_UgiDoF3YD…)
- "also ~as it gains more popularity the amount of AI slop on the internet will gro…" (ytr_Ugz5Dijg1…)
Comment
as a software engineer, I really appreciate the description given about what AP is actually doing, not in terms of accuracy but in terms realism, rather than describing it as some magical black box savior, I also wish they weren't allowed to use any wording or phrasing that implied the feature had anything to do with the car being capable of driving itself without any user intervention, and also also, wish the general public viewed self-driving vehicles in general the same way we tend to discuss commercial nuclear fusion power plants, always 5-10 years away with numerous challenges currently to overcome, I don't expect to see 'real' self-driving vehicles widely commercially available in the 2020's at least, I would argue there are more problems that need solved than what any individual car manufacturer can accomplish, 'real' self-driving vehicles will likely require a corresponding paradigm shift in infrastructure design and engineering
youtube
AI Harm Incident
2022-09-06T17:1…
♥ 75
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugwijou4cZywxDHvkil4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaaQAN2LOIMrtIFk94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzuFbBGTk91n5cRqvd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcZC9tbBgwqoGqdMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyALLyMAc_pY0R4nJh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw044-5uXVIsrAOJzV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyr7_qzNvmwE1Vu9Bp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgwObOlQynetUkHYq1Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzFF3KTuHnW0XX1o594AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwi5NF8At6afXzXbK14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
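The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such output could be sanity-checked before ingestion, assuming an allowed-value set inferred only from the values observed in this sample (the actual codebook may include categories not seen here):

```python
import json

# Allowed values per dimension, inferred from the observed sample output.
# This is an assumption, not a published codebook.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"approval", "disapproval", "fear", "outrage",
                "indifference", "resignation", "unclear"},
}

def validate_codings(raw: str) -> list[str]:
    """Parse the LLM's JSON array and return a list of problems found.

    An empty list means every row had an id and only expected values.
    """
    problems = []
    for row in json.loads(raw):
        cid = row.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                problems.append(f"{cid}: unexpected {dim}={value!r}")
    return problems
```

Running `validate_codings` over the array shown above would return an empty list; a row with a value outside the inferred sets (or a missing dimension) yields one message per offending field, keyed by comment ID.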