Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugi9TKCxi…: "I think the nearest prediction is the movie A.I. I remember watching that movie …"
- ytc_UgyIAEFOt…: "What a terrible messenger for 'adherence to truth', 'AI safety' or 'empathy for …"
- ytc_UgyAWj2Ao…: "Pretty fascinating and educational. But how will AI or super intelligence change…"
- ytc_UgyJeGOCu…: "Why is Autopilot even allowed? It does not make up for human control. Cameras i…"
- ytc_UgwbV34Li…: "If this could soon become an indelible truth, it would be like an enormous bom…"
- ytc_UgwfsMNJJ…: "Don't encourage AI. If you ask it a scientific question, it won't be able to tel…"
- ytr_Ugx485HFx…: "Hi Chandaka, we are sorry to say that you got the wrong answer but in any case,…"
- ytc_UgxwhssrD…: "ChatGPT show me a picture of my father" / "Alright here it is:" / "Shows a blank p…"
Comment
I’ve had a *very* similar conversation with chatGPT lmao, LLM gonna LLM
I believe I got it to agree to a narrow definition of consciousness using the (very TL;DR) comparison that human beings and AI are both essentially matter responding to outside stimuli. It would not budge on the slight caveat that, since it requires input (stimuli), it would be conscious only for the duration of processing that input, and that only if it were equipped with sensors representing a nervous system could it perhaps be considered fully conscious.
youtube · AI Moral Status · 2024-08-03T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugwbc2mjtbNkiJAqge94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxPzyBj-97aR1zA0Bh4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxHHLT5kV5KfaDVenV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwBQrarOxCA4y2VUL94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyjkjf1uqaiJHfYG5l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwizGq6xmt81pXZHHt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzNcFdQ8k9nirBV5eN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"amusement"},
{"id":"ytc_UgzsRwbT5UsygRVlNoR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzQ6Con_Q6QevVV2Wp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzmNRVRGJ9UViRheA14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
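The raw response above is a JSON array of coded records, one per comment ID, with one value per coding dimension. A minimal sketch of how such a batch could be parsed and sanity-checked before use; the allowed value sets are inferred only from the records shown here, and the real codebook may contain more categories:

```python
import json

# Allowed values per coding dimension, inferred from the records above
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"indifference", "mixed", "fear", "outrage",
                "amusement", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # a record without a comment ID cannot be joined back
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # → 1
```

Filtering rather than raising keeps one malformed record (a common LLM failure mode in batch coding) from discarding the whole batch; dropped IDs can then be re-queued for recoding.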