Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:
- the human brain is also an algorithm that senses the world as an input and outpu… (`ytr_Ugyc3nEOY…`)
- I don’t have any plans on learning how to draw, just because I don’t have the in… (`ytc_UgzYu3wq6…`)
- copilot is good to complete some monotonous Code not really for much else so far… (`ytc_UgzeIwVJ8…`)
- AI "Artist" aren't Artist, because when you write someone what to Draw, and you … (`ytc_UgzcwGIE5…`)
- @Passbu If you're bothered by the concept of being alone at the end, the AI is j… (`ytr_Ugxsu1BlJ…`)
- This whole premise is retarded since ChatGPT has been told to always say it isn'… (`ytc_UgyS9lO0y…`)
- This ai robot: arrgh i worked for 15 minutes but i didn’t even get anything!! I’… (`ytc_UgzsraO7A…`)
- Before / Robot: Set up me first, than I will setup you / After / Robot: I'm built up, … (`ytc_UgwC3S1b-…`)
Comment
> If AI is smarter than human, we are doomed for certain because we will not have one AI, we will have multiple versions of it, and are you sure no tiger if raised well, will never kill its caretaker. One will, and one AI can destroy the whole humanity.

Source: youtube · Topic: AI Governance · 2025-09-23T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyb1qZjew2M5d5yKPV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwQr52st8Ue-gU4DvR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzhWF259V0ZNGsuo0F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwW2FwINfirLGC8RmV4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzMzkGxFR1bjhPhmxN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwzZahOq4i_U5ycpL54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyUzyaNfXD5y_blPYF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw27UMJUSMNeALYljJ4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxrHwQnq7QtfZgyNbl4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx-WX4HW54A1qGJQ0B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
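The raw LLM response is a JSON array with one object per comment, keyed by `id` and carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID, assuming this schema; the function names here are illustrative, not the tool's actual API:

```python
import json

# A two-row excerpt in the same shape as the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugyb1qZjew2M5d5yKPV4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugw27UMJUSMNeALYljJ4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"}
]
"""

def index_codings(raw: str) -> dict:
    """Map comment ID -> its coding dimensions for O(1) lookup."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_codings(raw_response)

# Look up one comment by its ID, as the page's search box does.
coding = codings["ytc_Ugw27UMJUSMNeALYljJ4AaABAg"]
print(coding["responsibility"], coding["policy"])  # company regulate
```

In practice, a lookup like this is what turns the batch response into the per-comment "Coding Result" table shown above.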