Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
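Looking up a coded comment by its ID, as described above, can be sketched as building an in-memory index over the coded records. This is a minimal illustration, not the tool's implementation: the record shape is taken from the raw responses shown on this page, and the function name is an assumption.

```python
def index_by_id(records: list[dict]) -> dict[str, dict]:
    """Map each comment ID to its coded record for O(1) lookup.

    Records without an "id" field are skipped. (Hypothetical helper;
    the record shape mirrors the raw LLM responses on this page.)
    """
    return {rec["id"]: rec for rec in records if "id" in rec}


coded = [
    {"id": "ytc_UgzW1MfKoOdl3vmwEBZ4AaABAg",
     "responsibility": "distributed", "reasoning": "consequentialist",
     "policy": "regulate", "emotion": "indifference"},
]
by_id = index_by_id(coded)
print(by_id["ytc_UgzW1MfKoOdl3vmwEBZ4AaABAg"]["policy"])  # prints "regulate"
```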
Random samples:

- "Those AI Centers use huge amounts of Energy...huge!!! Long before AI will destro…" (ytc_Ugxcer26q…)
- "This is a great video for intro to philosophy students! You get to watch AI be b…" (ytc_Ugw3RMa1H…)
- "I’d be more worried about AI if I was a white-collar worker than a truck driver…" (ytr_UgxIHq0e9…)
- "This aint real cause the other two people were also ai or as i should say is tha…" (ytc_UgwyB_uih…)
- "Every single one of these crashes are the driver's fault. Full self driving a ba…" (ytc_UgyditRcY…)
- "please keep doing this. Destroy AI please. There are so many AI tracers + scamme…" (ytc_Ugy6SQe9i…)
- "You asked AI the wrong way, because you direct it to choose a side, next time as…" (ytc_UgzGdDRjN…)
- "@douglaserb1 It is. Judging from how things are developing in China, it will be …" (ytr_UgwEvVbag…)
Comment
I’m not really worried about losing my job. If 90% of humanity lose their job, governments will have to radically change how society works. Everybody always assumes a superintelligence will be malign, but we have no certainty of that. It could be benign. Why is everybody so certain that it will want to kill us all. Maybe it’ll enjoy taking care of us 🤷🏼♀️ And I could easily find another way than work to spend my time. I have no way to influence what happens with AI so I chose not to worry about it. But I also don’t plan my life more than 5 years in advance nowadays because everything is changing so quickly.
youtube · AI Governance · 2025-09-06T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxeJUYHlnMvIsNut4N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwovvLx0diOJ4oBzkV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgytcYUlWTYs4XSPylZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxIT86gumpvTmEfoBR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx12VeK1B4RDJb63X94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxQr19l5uaiBQ09sMR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgysYwYMrYdO3AlI3El4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyxnOBafrGTBpG3ieV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugya9Zn8VR2EzUuICFF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgzW1MfKoOdl3vmwEBZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
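The raw response is a JSON array with one object per coded comment. Parsing and validating such a response might be sketched as below. The allowed values per dimension are only those observed in the samples on this page, not necessarily the full coding scheme, and the function name is an assumption.

```python
import json

# Allowed values per dimension, as observed in the coded samples on this
# page. The actual coding scheme may include values not seen here.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "mixed", "fear", "approval", "outrage"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records.

    A record is kept if it has an "id" and every dimension holds one of
    the allowed values above; anything else is silently dropped.
    """
    valid = []
    for rec in json.loads(raw):
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # no comment ID to attach the codes to
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid
```

Dropping malformed records rather than raising keeps one bad line in a batch response from discarding the whole batch; a stricter variant could log or re-prompt on failures instead.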