Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
CAN YOU FUCK IT????
Engineer: Excuse me?
CAN YOU FUCK IT?
Engineer: Noo.
CONSUM…
ytc_UgzMKFzOo…
https://youtu.be/zvl6eBTsfVc?si=9a3PWxfEYJS7gbry
It happened to factory workers…
ytc_UgzChoWHl…
I wanted initially do defend the parents but then I read The New York Times arti…
rdc_naskke8
The idea that AI will take over is stupid. Computers don't work that way. If a r…
ytc_UgximfIqg…
As a former artist (I had to stop drawing due to medical issues effecting my han…
ytc_UgyIxgQfx…
ITS END TIME ANTICRHSIT EXTERMINATION KALKI AVATAR EXTERMINATING THE EVIL DOERS…
ytc_UgyLMZmRQ…
Wrong. There will be a new Middle Class: those who know how to use AI and those …
ytc_UgxX7APIB…
@BrendanDell The publicly accessible LLMs right now are dumb. AI companies have …
ytr_UgxZnVDLt…
Comment
All these futuristic scenarios (driverless cars, Amazon drones, robots everywhere), are always done from the perspective of the west, indeed the US, even if they claim it’s global. The reality is the infrastructure in India, Africa, SE Asia and much of South America simply isn’t geared up to become these technologically advanced, this quickly. So if it happens, if AI wipes out humans anywhere, it’ll likely be mainly in the US, which isn’t necessarily a bad thing….
youtube
AI Governance
2025-08-04T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxZOfM7b67FLJbA1H94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwOEpfwm9-6HRQvHk94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypjNc5ozp8XqxnTXZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwjLNvwg3VjkU0F1954AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyKRBzN8Q08SKo-tu14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxphkgvocdXRCzl7_Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw7dl_DxGourg-FxYh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzJUniGrYr_SfThUPN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwVe8JHfhvWn9W5Rop4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyd5PyLspjSc4LeCR94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
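The raw LLM response is a JSON array of per-comment codes, one object per comment with a `responsibility`, `reasoning`, `policy`, and `emotion` value. A minimal sketch of how a pipeline might parse and validate such a response, assuming the allowed value sets inferred from the samples above (the actual codebook may include more categories):

```python
import json

# Allowed values per coding dimension — inferred from the sample responses
# above; this is an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"resignation", "fear", "approval", "outrage", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only records whose
    dimension values all fall within the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical usage with a two-record response, one of them invalid:
raw = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"},'
    '{"id":"ytc_b","responsibility":"robots","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print(parse_coding_response(raw))  # only the first record survives
```

Filtering rather than raising keeps a long batch run alive when the model occasionally emits an off-codebook label; rejected records can be re-queued for recoding.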