Raw LLM Responses
Inspect the exact model output for any coded comment, looked up by comment ID.
Random samples

- "My friend's son was dismissed with 119 other high-tech employees of one firm bec…" (ytr_Ugxuu6mLE…)
- "Some AI edits look so convincing. I ran one through an AI image detector and yea…" (ytc_UgwqKS4Oj…)
- "If AI will take all the jobs and most of the job holder people are not dealing o…" (ytc_UgwM7ZEDW…)
- "@andyreacts I'm not so concerned about how much money it takes (as I don't have …" (ytr_UgyaG5iKv…)
- "AI doesn't even have to be very intelligent to destroy us. Just some ignorant pe…" (ytc_UgzErGK18…)
- "well ai can always make things up. like maby ai thought it was just a game with …" (ytc_UgzmCXT9n…)
- "I would argue that depending on how fast this goes, it might reduce income for c…" (ytc_UgyDf4gf-…)
- "These greedy companies wanted to eliminate human employees. Humans have the uppe…" (ytc_UgwuUF0tp…)
Comment
I believe that self driving cars, or cars in general, might never be truly safe until we hyper modernize roads. Basically making them smart, incorporated sensors and set virtual paths for vehicles. This would be the single most expensive project humanity would have achieved for quite a long time, and the resources needed would be tremendous too, not even mentioning the missing technology we need to invent. It seems impossible, but it is doable, and I can see it becoming a reality in 100-200 years, if society doesn't end because a fat kid or bald grandpa got a bit too mad.
youtube
2023-08-03T19:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy-hqylfhc7ZoaQqBl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgygTOfSZNd_29mNsu14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwMTqi9TLjRTB46STR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwmLab3KIQRIiHmrSl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyA5GQ7YfEN8Qr1CPF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugzn4_ruCYg4FGvuOgF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwywOOVA1I-n34OT0F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz3kRRafQDFP9Pj_dh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz_KsJrWinrz9-BbmB4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxd9_TEqH52OzA5kkV4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"fear"}
]
```
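The raw response is a JSON array of one coding object per comment, so any coded comment can be retrieved by its ID. A minimal sketch of that lookup (the variable names are illustrative, not part of the tool; the single-row payload below is abridged from the array above):

```python
import json

# Raw LLM response: a JSON array of per-comment coding objects
# (abridged to one row for illustration).
raw_response = """
[
  {"id": "ytc_Ugy-hqylfhc7ZoaQqBl4AaABAg",
   "responsibility": "none",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "indifference"}
]
"""

# Index the codings by comment ID for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Inspect the coding for one comment.
coding = codings["ytc_Ugy-hqylfhc7ZoaQqBl4AaABAg"]
print(coding["emotion"])  # indifference
```

In practice the same dictionary serves every dimension in the result table above (`responsibility`, `reasoning`, `policy`, `emotion`), since each row carries all four codes.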