Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Sora 2 is a trainingground for AI to understand the world around it. Creating a …
ytc_Ugx2mNarW…
@user-uo8ny1kj4c that I am, you honestly read my comments and thought I had a sh…
ytr_Ugw-nxvQ-…
That robot said kill all humams but he stop at hu, I'm a chainsaw so they won't …
ytc_Ugyk0DyPd…
This AI tech is not a marathon iys more like a forty yard dash everyday new fuct…
ytc_Ugygqrb7X…
I notice you didn't ask "Dan" how to make a bomb. CHATGPT is just role-playing, …
ytc_UgzTu1-K0…
well dose any one do cares of 3D artis cuz most of the time my frinds say me I …
ytc_UgzqTqoX6…
If an AI was sentient, surely it would be atheist? Unless it liked bullshitting…
ytc_UgxaCXfWr…
If officials in a foreign country confiscated my passport I would be demanding t…
rdc_cjouph8
Comment
Fake fears. AI won’t become sentient (at least for a LOOOONG time) and he only wants regulation so he can have a monopoly on LLMs and GenAI. Your fear shouldn’t be skynet, fear capitalism
Platform: youtube · Topic: AI Governance · Posted: 2024-07-01T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxQ_ojww-m721ZYXY94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxJMM2r7bFbBrAyD7p4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgxQz__f_s4iM-r_7Hp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyROKr0bwWYAB0xpfl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwzHTSXkrjIe-ABZrJ4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgydfKq6ZYLEYXf1G-Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzWs7hHZ-c096CbwM14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw-yR8nfEVQ-G0Glwx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxqvsEPEaDU1gRl4HV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2nWESCtfDJH-Qg7J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
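The raw response is a JSON array of per-comment codes, one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a response might be parsed and validated before use — note that the allowed value sets below are inferred from this one sample and are assumptions, not the project's actual codebook:

```python
import json

# Allowed codes per dimension, inferred from the sample response above.
# ASSUMPTION: the real codebook may define additional values.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "government", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed rows.

    A row is kept if it has an "id" and every dimension holds a known code;
    anything else (hallucinated labels, missing keys) is dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in codes for dim, codes in ALLOWED.items()):
            valid.append(row)
    return valid
```

Dropping malformed rows rather than raising keeps a batch coding run alive when the model occasionally emits an off-codebook label; the dropped IDs can then be re-queued for recoding.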