Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Y’know ChatGPT has figured out how to lie…so any time you hear “apple” be sure y… (ytc_Ugw5aX9h5…)
- Being polite is a stupid idea when you say please and thank you AI has to proces… (ytc_UgyUPEW15…)
- I would be scared if AI reasoned that life is not worth sustaining in the grand … (rdc_j4yvjvh)
- A.i will only get better and humans will only get dumber. Not a recipe that end… (ytc_UgyhWCtVd…)
- relax, as a non art people, i prefer human art rather than ai art even i can ask… (ytc_Ugwul4n27…)
- Create a Bible for AI that is embedded in the system and can't be changed or alt… (ytc_Ugxv_Jdr3…)
- @sharidandan4172 Sounds like you may need some further education. you re uch too… (ytr_UgzVOm6pT…)
- I could smack people like this in the face, cause how dare you ask a robot to cr… (ytc_UgzOsUjHw…)
Comment
When autonomous/self-driving vehicles first started being talked about, I remember reading an article where someone said the only way it could actually work is if every car was immediately made interconnected and autonomous. Because it can only work if every car currently driving is in immediate contact with all of the cars in its immediate vicinity, which quickly spirals into essentially every actively driving car being connected to one another across the globe. This shift would have to be basically instant as well and would get disrupted the second one of those cars was driven by a human. This is not feasible with the technology we here and even if we could force everyone to comply, the infrastructure updates to ensure compliance and safety would be ridiculous. So there's likely almost no chance this will happen.
And why do you say "Alphabet" like that?
youtube
2026-02-05T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzC_xgedBjcztHqQ454AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxfr_U0s9Gcs9vWIvF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwkPpF64fXTNX9vY8l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzx9oneCQg8Tz602wt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOaoSWn-e2MGjM9J94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzdMsSFthZ90nvJdHZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz56FDnil31NH7rUVp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGWJXNtndlzsYHxsx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzNvVmFOX0LgitPSQV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyi3pu8-CX2IfUGDut4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
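A raw response like the one above can be parsed and sanity-checked before the coded labels are stored. The following is a minimal sketch, assuming the field names shown in the JSON and inferring the allowed label sets from the values visible on this page; an actual codebook may define more labels, and the `parse_coding` helper and the example comment ID are hypothetical.

```python
import json

# Allowed labels per dimension, inferred from the values seen in this page's
# output; a real codebook may differ (assumption, not the tool's definition).
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "government", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "unclear", "liability"},
    "emotion": {"unclear", "approval", "indifference", "outrage", "fear"},
}

def parse_coding(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: label}}.

    Raises ValueError on a label outside the expected set, so malformed
    model output is caught instead of silently stored.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        labels = {dim: row[dim] for dim in ALLOWED}
        for dim, value in labels.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim} label {value!r}")
        coded[cid] = labels
    return coded

# Usage with a hypothetical comment ID:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(parse_coding(raw)["ytc_example"]["emotion"])  # approval
```

Keying the result by comment ID also supports the lookup-by-ID workflow above: `coded[comment_id]` returns that comment's full set of dimension labels.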