Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a comment ID directly or by browsing the random samples below.
Look up by comment ID
Random samples — click to inspect
Each human lives in it's own universe, and mine is different to yours...and your…
ytc_Ugzxoeupi…
What we do know is that AI will need power, so if i was going to try and do some…
ytc_UgwFEWh7J…
the definition of art is "the expression or application of **HUMAN** creative sk…
ytc_Ugzpu3JFD…
Humans can’t even do this with each other. How the hell are you expect them to d…
ytc_Ugw7r5csk…
The AI facial recognition in that 2023 Peppermill case was impressive for its ti…
ytc_UgwB3yvH7…
Get very good at a sport. AI ruling us will need entertainment and you may come …
ytc_UgwgTCAu4…
The hell do they need all these data centers for? They didnt need them before 20…
ytc_UgwpO4L1z…
This level of interaction is what the developers of ChatGPT are designing for. T…
ytc_UgwgICezw…
Comment
Self driving car is a great idea, no doubt, it saves time and improves driving experience, but there are two main issues: first, countless difficult scenarios, such as weather influence, icy road, flooding, etc, secondly, it might invite mass attacks such as computer virus attacking self-driving or centralized road network system. Human drivers will definitely fail to compete with machine in repetitive tasks, but our independence is the best public safety belt.
youtube
AI Harm Incident
2016-03-03T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UggLgMjOnAq3engCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UghyqqDTlrLf9HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjyLWph_MtItXgCoAEC","responsibility":"user","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugigy3nbNEhlSngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UghcuF6gJA-fpHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgjTLCUkXByJc3gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UggLICqx-XT7aHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgiesN3Zk63rRHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgjRuKELFIGsrXgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjtG81Si3yyjHgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"}
]
```
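
Raw responses like the one above are only usable if each record carries a valid comment ID and an allowed value on every coding dimension. The sketch below shows one way to validate such a batch, assuming the allowed values are exactly those visible in the samples on this page (the real codebook may contain more); the `validate_record` helper and the `ALLOWED` table are illustrative, not part of the tool.

```python
import json

# Hypothetical codebook: values inferred from the sample output shown
# above, not necessarily the full set the coder was allowed to use.
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed"},
}

def validate_record(rec: dict) -> list[str]:
    """Return a list of problems found in one coded record (empty if OK)."""
    problems = []
    # YouTube comment IDs in this dataset are prefixed "ytc_".
    if not str(rec.get("id", "")).startswith("ytc_"):
        problems.append(f"bad id: {rec.get('id')!r}")
    for dim, allowed in ALLOWED.items():
        if rec.get(dim) not in allowed:
            problems.append(f"{dim}={rec.get(dim)!r} not in codebook")
    return problems

# Parse a raw LLM response (a JSON array of records) and report issues.
raw = ('[{"id":"ytc_UggLgMjOnAq3engCoAEC","responsibility":"user",'
       '"reasoning":"deontological","policy":"none","emotion":"approval"}]')
for rec in json.loads(raw):
    issues = validate_record(rec)
    print(rec["id"], "OK" if not issues else issues)
```

A check like this is worth running before the coded values are written back, since a model occasionally emits a value outside the codebook (or truncates the JSON), and silent acceptance would corrupt the coding table.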