Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- So government isn't always bad.!? I didn't know that. But seriously, if Musk as… (ytc_UgxWWUrUL…)
- I remember the book Superintellegence by Nick Bornstrom and how it posited that … (ytc_UgypVnkmi…)
- Actual artists: "You are using paint and canvas? That is not real art. Do real a… (ytc_UgzejSCee…)
- AI shouldn't be used like this. What I'm tryna say, is that AI might be OK in so… (ytc_UgySV7k9v…)
- No, the ai risk is NOT being "overhyped" just this evening I shared with google … (ytc_UgynZnJrq…)
- AI art making me so frustrated... I'm a young art student and i know that people… (ytc_UgxxAY3ix…)
- 😮 no😮 do you know about brain organic computers we can make AI out of human brai… (ytc_UgxMlsi2N…)
- Funny. I asked ChatGPT if the people who create these types of fear-mongering vi… (ytc_UgysTePUS…)
Comment
There is another important problem with driverless cars. My car provides some personal space wherever I am with it. Somewhere to leave shopping between purchases, somewhere to deal with emotional or clothing malfunctions, somewhere for private conversations, somewhere out of the rain or sun, etc. I think that is a powerful driver for personal cars.
A driverless car is a driverless taxi, with all the disadvantages that discourage us from using taxis every day.
And widely used, they become a driverless bus, with all the misuse that a bus gets.
If it is personal, it is not reducing car running distances.
That is assuming that they can be used in a way that adapts to new surroundings without detailed training.
youtube
2025-06-24T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzRaZobZ90HdH8tHhl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyNdu5z0-knTwdJbhN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQevq-cRUTi8AAiHt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwfVpdQKxakicc_mqp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyCLtr2HXzJBIrllRh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwNWLDoJYwqSqBGVE54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzItZgV7-8ALx1CpUZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzAGhhfqPj-CpM6-HF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy3RtIM5OI_qLNpu-Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwlb85Lejf73FstI7x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
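A minimal sketch of how a response like the one above could be parsed and validated before its values are written into the coding table. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON shown here; the sets of allowed values are inferred only from these ten records, so the real codebook may define additional categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the full codebook may define more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record
    against the allowed comment-ID prefix and codebook values."""
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: {dim}={rec.get(dim)!r} not in codebook")
    return records

# One record from the raw response above, used as a smoke test.
raw = (
    '[{"id":"ytc_UgzRaZobZ90HdH8tHhl4AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]'
)
coded = parse_coding_response(raw)
print(len(coded), coded[0]["emotion"])  # 1 approval
```

Validating at parse time means a model that hallucinates an off-codebook label (or drops a comment ID) fails loudly here, rather than silently producing an uncodeable row in the results table.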