Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_Ugz76W5pw… — "There is more meanig to those terminator movies than just killing humans. It for…"
- rdc_ohv3pc2 — "This is a massive point that usually gets drowned out by the "intelligence" arms…"
- ytc_UgyDk7w4j… — "Woah, the artist mentioned that he is an AI Artist(which sounds so bad, but it i…"
- ytc_Ugw6IOqZw… — "Such a pity the debate got stuck on the meta level. OpenAI has been fine-tuning …"
- ytr_UgzEodrgZ… — "@simoneoliveira5806 Hahaha, thank you for sharing your concern about the robot-h…"
- ytc_UgzgqD44W… — "Since you created AI / Why would you have a concern of AI exterminating humanity …"
- ytc_UgzAkM7ux… — "This style is dreamy, is that from Midjourney v6? I’ve been doing similar and pu…"
- ytr_UgzSOWZRP… — "It seems like your comment might have a typo or might be in a different language…"
Comment
> All the people who are saying ai job will kill all the senior engineer.. they just answer one question.. do that AI agent take responsibility of code.. can we sue their creators in a court if the AI agent make mistakes.. will they pay all the loss a company bear if their agent made mistake in production.. the answer is no..
> And another question will the tech industry survive if tomorrow all the senior devs vanished from the planet.. the answer is no..
> It's a hoax they are trying to create to sell the product which don't solve any real problem
> The AI can only solve which is a closed system.. like chess bot or any other game bot.. but engineering is not closed system with millions of possibilities.. and experience is not sold in the supermarket
youtube · AI Governance · 2026-02-08T17:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzFFng3c5RYt1eYKEB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxnVGlej0i1n4sLDb94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxrFK3lAj4puqEwqJF4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzMX2saRqBfbIiaYGx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyUviqs80v8Xv3VF3F4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyWRYQ0egfCC8skGi14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyIbyh8I9_C6r4roe14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxLPOtzQj9KheDN1rd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyR5JGCdprueqaelMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxaTUbC1Slyo41V3uN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"outrage"}
]
```
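A minimal sketch of how such a raw batch response could be parsed and indexed by comment ID, so the coding-result table for any one comment can be looked up. The allowed values per dimension here are inferred from the sample output above, not an authoritative codebook; `index_batch` is a hypothetical helper, not part of the tool:

```python
import json

# Allowed values per coding dimension, inferred from the sample batch above.
# The real codebook may define additional categories.
CODEBOOK = {
    "responsibility": {"developer", "government", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def index_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index records by comment ID,
    dropping any record whose codes fall outside the known codebook."""
    indexed = {}
    for rec in json.loads(raw):
        codes = {dim: rec.get(dim) for dim in CODEBOOK}
        if all(codes[dim] in CODEBOOK[dim] for dim in CODEBOOK):
            indexed[rec["id"]] = codes
    return indexed

raw = ('[{"id":"ytc_UgzMX2saRqBfbIiaYGx4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"liability","emotion":"outrage"}]')
coded = index_batch(raw)
print(coded["ytc_UgzMX2saRqBfbIiaYGx4AaABAg"]["policy"])  # liability
```

Dropping (rather than repairing) out-of-codebook records keeps the index trustworthy when the model hallucinates a category; rejected IDs could instead be queued for recoding.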