Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> I am just curious, If we live in simulation, then why does he care about AI and infinite life? It doesn't matter, since the simulation loses sense if it runs forever. The goal of every simulation is to yield result, but if the simulation is infinite, the goal can't be achieved, so it inevitably will be cancelled by those who run it. Like we do with the software that went into an infinite loop, you simply kill the process.

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Timestamp | 2025-09-07T01:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
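Every coded row carries the same four dimensions shown above. As a minimal sketch of what validating one row could look like, using only the value sets that actually appear in the batch response below (the real codebook may define more values than these):

```python
# Value sets observed in this batch response; the full codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "regulate", "liability"},
    "emotion": {"indifference", "fear", "approval", "resignation", "outrage"},
}

def validate_row(row: dict) -> list[str]:
    """Return a list of problems with one coded row; empty means valid."""
    problems = []
    if not row.get("id"):
        problems.append("missing comment id")
    for dim, allowed in ALLOWED.items():
        if row.get(dim) not in allowed:
            problems.append(f"unexpected {dim}: {row.get(dim)!r}")
    return problems
```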
Raw LLM Response
```json
[
{"id":"ytc_Ugz0ZoTGeW5tkDY3jKd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyO4LiTX6n-ghy7RKR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx3JgU_6iUZpejvNb94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzaCYgjseO0F00ezG94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzYmU4f6sOub8Vi4954AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx-7WTfycmsA04yU2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx4dyLDCEmSfaIL6Th4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwKao-hPMKf2q5JgHF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgwO5jifwY-Cr93RNEV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz-QPJ_2BWAf0iC7uZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
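The Coding Result table above is one row of this batch array, selected by comment ID (the first row here matches the table exactly). A minimal sketch of that parse-and-lookup step, assuming the raw response is valid JSON; the function name `index_by_comment_id` is illustrative, not the tool's actual code:

```python
import json

# Raw model output for one coding batch, abbreviated to a single row.
raw_response = """
[
  {"id": "ytc_Ugz0ZoTGeW5tkDY3jKd4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse a batch coding response and index its rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

coded = index_by_comment_id(raw_response)
result = coded["ytc_Ugz0ZoTGeW5tkDY3jKd4AaABAg"]
print(result["responsibility"], result["reasoning"],
      result["policy"], result["emotion"])
# -> none unclear unclear indifference
```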