Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Perhaps its more useful the other way. Comparing AI to consciousness helps us u…" (`rdc_djzneac`)
- "I am using AI but I can't make a book I am telling AI to make make it on a good …" (`ytc_Ugwlpz1Ks…`)
- "at first I wasnt using AI in college when it came out. but overtime I noticed E…" (`ytc_UgxTrgPKz…`)
- "This is one if the many reasons why I feel like AI is in a giant bubble ready to…" (`rdc_lgmp527`)
- "A lot of scientists and philosohpers believe that consciousness is an emergent p…" (`ytc_UgzREezcg…`)
- "I think we are more likely to make a sentient biodroid than silicon based sentie…" (`ytc_UgjxYcYgD…`)
- "So this dude spent his life creating ai and now he's warning us of the dangers ?…" (`ytc_Ugxc6JMfW…`)
- "Equality is not about equality of gains and achievements- equality is about equ…" (`ytc_UgyBNyX0z…`)
Comment
> The Thumb Nail:
> Driverless trucks are ludicrous. Why even bother to draw a drivers seat with a driver dashboard?
> Can't even get THAT right.
> Do you get what I mean?
> How are you going to solve the problem of the uneducated people in cars and how they act around a truck WITH a driver? A truck WITH a driver will have compassion and give his life and drive off the road to keep from hitting a sudden stop in his lane. A truck WITHOUT a driver will plow through you. No a computer can't make that call. Especially in bad weather.
> Do you get what I mean?
Source: youtube, 2018-12-19T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgzrhvsO1t5znNaKAsh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzIkFFXsWv10iyQKOB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_UgzZF7ngLX7ou25gHex4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxAYh0boeTvBgftPOh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgzPKRBnC62os6VEaFx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzSMeVQFBIB09hJUrZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyuTHQMnHwWMMn3e054AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwSja7hR27OiuU3YP14AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgwT4zIpw9kwymNqBCx4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgxfXVUwNw_EVzDT9u14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}]
```
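The raw response is a JSON array of per-comment coding records, so looking up a coded comment by its ID reduces to parsing the array and matching on the `id` field. A minimal sketch of that lookup, assuming the field names shown in the response above (`lookup_by_id` and the two-record `raw_response` sample here are illustrative, not part of the tool):

```python
import json

# Abbreviated sample of a raw model response, using the same schema
# as the array above (real responses contain one record per comment).
raw_response = """[
 {"id": "ytc_UgxAYh0boeTvBgftPOh4AaABAg", "responsibility": "developer",
  "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
 {"id": "rdc_lgmp527", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def lookup_by_id(raw, comment_id):
    """Return the coding record for a comment ID, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

coded = lookup_by_id(raw_response, "ytc_UgxAYh0boeTvBgftPOh4AaABAg")
print(coded["policy"])  # -> ban
```

The record returned for `ytc_UgxAYh0boeTvBgftPOh4AaABAg` matches the Coding Result table above (developer / deontological / ban / outrage), which is the consistency check this view is meant to support.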