Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I'm soon gonna be uh fitness trainer and a cinematographer/camera man. So I'm ok…" (ytc_Ugwsj6k4z…)
- "Here's one: If the AI Agent is so good, create an internal one to do the cold ou…" (ytc_UgwK6zt0X…)
- "Listen.. why you people still doing this? Lemme just.. ok so it’s simple: use AI…" (ytc_UgwabJYT5…)
- "I can’t draw very well, but I have never used AI art as anything more than a jok…" (ytc_UgwbtOjRN…)
- "O and one more thing about knowledge and your right to free trail a car as you p…" (ytc_UgwGJw_kT…)
- "I am an artist who is pro-AI. (To be clear, I draw and color my own art.) I agre…" (ytc_UgwtbGrRy…)
- "we have miserably failed social media, yet we won't do anything about that, and …" (ytc_Ugxgt3n2p…)
- "if this is a simulation why would the ai destroy it seems like it would wont out…" (ytc_UgwdEojch…)
Comment
> Fictitiously, inside of the video game Star citizen, if you follow its lore, as humans we have done this three times, and it has ended with laws that restrict, and almost downright prohibit AI usage all together. Today's sci-fi is tomorrow's reality. People have thought this out and we know where it goes. The reason for this, is just people trying to protect their investment. And we're going to get burned for it.
Source: youtube
Topic: AI Governance
Posted: 2025-07-02T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyO1jniw4dj-mB3SwN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyAZI192EzNKfNufnJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDbkavFER8pnscvuh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgzE10Ss_KzzYLNHU4d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzhR_mQiXPix587Gcp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy8-dvQZkCNQSDo4_V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwPh2R-hEv-qhGMvV94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzUrNYnVoCUQ5Im5Qx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwVt6syoI27E9BcDp14AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyd43o2bXImFfMfv9Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"}
]
```
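The raw response above is a JSON array of per-comment records, one object per coded comment. A minimal sketch of how such a batch might be parsed and validated before the labels are stored; the allowed label sets below are inferred from the examples on this page, not from an official codebook, so the real pipeline may accept additional values:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# Assumption: the actual codebook may define labels not seen here.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "developer", "company"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "resignation", "mixed", "outrage"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must be an object with a comment ID like "ytc_…".
        if not isinstance(rec, dict) or not str(rec.get("id", "")).startswith("ytc_"):
            continue
        # Every dimension must be present and carry a recognized label.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = '[{"id":"ytc_x","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
print(len(validate_batch(raw)))  # 1
```

Dropping malformed records rather than raising keeps one bad line from failing the whole batch; a production coder would likely also log the rejects for re-prompting.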