Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "A lot of my friends used to support ai a good bit, but stopped after I started d…" (ytc_UgzhQIwwf…)
- "Ai in itself is pretty benign. Like a hammer. It is a tool. That has good use ca…" (ytc_UgzjJYjJ4…)
- "@xoxo_internetangel You do realise not every AI model is trained on “stolen art,…" (ytr_Ugwxi49Xb…)
- "A.I. is going to take so much money out of the average person's wallet and stick…" (rdc_m80lrjl)
- "All the boring, monotonous, and unfulfilling jobs are becoming increasingly auto…" (ytc_UgyZLzKgI…)
- "Should genuinely be limited to shittier generation so humans can tell if it's hu…" (rdc_oi3wsqa)
- "Shes talking to herself again.😂😂😂😂😂. If it could sense it was talking to another…" (ytc_UgwogQMhf…)
- "It’s more likely Internet will be killed by flood of bots, and cinema killed by …" (ytc_Ugz2vnyv6…)
Comment
40 min in and nothing have been said. Sam Altmans faults are just glossed over, combined with that that is the sole focus in this conversation. I don’t know anything about him, but it’s quite obvious that people in charge of tech firms are not people most of us want to be friends with. The Tesla guy, the microsoft one and this ai dude may all be terrible people. That’s not news. Yes I don’t know them and never will so I can’t be certain on their character.
But folks as ignorant as me fight for the Tesla one against the Microsoft dude, or vice versa. Apart from illusions, that cannot give you anything in this life.
Source: youtube | Cross-Cultural | 2025-07-02T17:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwiQ42ZPhUCBkvERvV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz-4qkHaXHgZgWicl14AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxqpJC-awpXQa5aiL54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw7-C_nq634uCFHS9x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugxp_h4_wTn89uMjzq94AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxl8fsJEUWqmzIuYyx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzdSy07_KDIbr67Pot4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzoyN9nRyVSNTs48Lt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx2pmsO_JiHsu2J4O14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx2MhVADGjvDPqmjOl4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
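Looking up a comment's codes from a response like this can be sketched as follows. This is a minimal illustration, not the pipeline's actual implementation: `index_codes` is a hypothetical helper, and the two inline records are copied from the array above; the only assumption grounded in the source is the schema (`id`, `responsibility`, `reasoning`, `policy`, `emotion`).

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgwiQ42ZPhUCBkvERvV4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugz-4qkHaXHgZgWicl14AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"}
]"""

# The four coding dimensions plus the comment ID, per the schema above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(response_text: str) -> dict[str, dict]:
    """Parse the model output and index records by comment ID,
    skipping any entry that is not a dict with the expected keys."""
    records = json.loads(response_text)
    index = {}
    for rec in records:
        if isinstance(rec, dict) and EXPECTED_KEYS <= rec.keys():
            index[rec["id"]] = rec
    return index

codes = index_codes(raw_response)
print(codes["ytc_Ugz-4qkHaXHgZgWicl14AaABAg"]["reasoning"])  # virtue
```

Indexing once and looking up by ID keeps each lookup O(1), which is what the "Look up by comment ID" view above amounts to; the key-set check simply guards against malformed entries in the model output.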