Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by its comment ID.
Random samples (previews truncated by the page; trailing "…" marks cut-off text):

- You may say it's not to replace doctors, but the capitalist greed knoews no boun… (ID: ytc_Ugy7FAA_S…)
- @voidberryblueberry I don't necessarily agree it's theft. Even if it is though A… (ID: ytr_UgxTvloZX…)
- The place I go to for anxiety therapy is thinking of getting a robot to help kid… (ID: ytc_UghG2150M…)
- can we not have AI in critical functions of the society? which idiot decided to … (ID: ytc_Ugwp26hmI…)
- He would say that, he has a vested interest. > said the act of driving is to… (ID: rdc_czxyspp)
- I'd be one that is 100% OK with AI art. Be anyone call it art, or not. Though, … (ID: ytc_UgyWh05Gy…)
- This is like when that lawyer did a "cross examination" with ChatGPT. ChatGPT ha… (ID: ytc_UgyzM4SIM…)
- AI is not gonna go anywhere. There’s too much money in it and too much potential… (ID: ytc_UgxWU2tLf…)
Comment
Ah, but you’re missing a HUGE factor; life extension and consciousness transfer.
Hear me out…
If an AI is programmed by an individual, specifically for ONE individual, estimates I’ve seen indicate that not only could this (perhaps robotic borne) AI know millions times more than us AND will be able to predict our behavior with a >95% accuracy.
I’ve further read research that indicates that if “persistently connected” to us, this AI can even predict what we will dream.
If true, maybe big IF, but if true, then when do we become the AI???
Maybe we should be looking at robotic borne “personal” AI as a life extension technology…
If it knows all we know, and we die, did we?
Source: youtube · Cross-Cultural · 2025-09-30T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzJOD8HQL9sU3mL1mh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwW6d4wKbxFmRDVCDd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzt_28T9vnvNAAlkU54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxFr_rEDUmZBPYuud14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugyl-F9r8wRzo1LNoE94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHN4iojsppKBDD1-N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxr9NYxm_aQaBRlBxB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxhLrWDZP9zMdKvajR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwOWErkIXlP1f2RYrV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy1Lq8kqPxZw3Wduop4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]
```
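A raw response like the one above can be parsed and indexed by comment ID for the lookup this page supports. The sketch below is a minimal, hypothetical example: the `ALLOWED` codebook lists only the values that appear on this page (the full codebook may contain more), and the `raw` string is a two-record excerpt.

```python
import json

# Coding dimensions with the values observed on this page only --
# an assumption, not the full codebook.
ALLOWED = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "mixed", "approval", "resignation"},
}

# Two-record excerpt of the raw LLM response shown above.
raw = '''[
 {"id":"ytc_UgwW6d4wKbxFmRDVCDd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwOWErkIXlP1f2RYrV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]'''

def index_codes(raw_json: str) -> dict:
    """Parse a raw response and index valid records by comment ID."""
    by_id = {}
    for rec in json.loads(raw_json):
        # Drop records whose values fall outside the observed codebook.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw)
print(codes["ytc_UgwOWErkIXlP1f2RYrV4AaABAg"]["policy"])  # ban
```

Filtering on a fixed value set catches the most common LLM-coding failure mode, where the model invents a label outside the schema.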