Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I don't get why all the people responsible for AI are now the ones saying how dangerous it is. Bartlett brought this up and Hinton said oh we didn't realise how dangerous this could be. I'm calling bullshit on this unless someone can make it clear to me. It seems very obvious how this could go. I can't believe how in every AI leader interview no one presses them hard on this, especially when this is 90% of what these people talk about?!
Source: youtube · Topic: AI Governance · Posted: 2025-06-16T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyAiTOedrBS8WNTDGd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxqLOJHMpGxwaQbFtZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyY04CCzB8EuCV5_bF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxTvpJrg-VRAsZ6zpJ4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyf_7ygdN7dVADAw6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKQ8402Egi5bDRRfF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugz8_ThM8byOBjplkQR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwfkGfYHhmjfE6sTPd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzB8GwtjR1rjEJbOhR4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwnAFwiAX2Nn3_VMhV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
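A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal illustration, not the tool's actual pipeline: the per-dimension codebooks are inferred from the values visible on this page and are an assumption; the real schema may allow more codes.

```python
import json

# Allowed codes per dimension, inferred from values seen on this page
# (assumption: the real codebooks may contain additional codes).
CODEBOOK = {
    "responsibility": {"developer", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only rows whose
    comment ID looks valid and whose codes all appear in CODEBOOK."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not row.get("id", "").startswith("ytc_"):
            continue  # not a YouTube comment ID
        if all(row.get(dim) in codes for dim, codes in CODEBOOK.items()):
            valid.append(row)
    return valid

raw = (
    '[{"id":"ytc_UgxqLOJHMpGxwaQbFtZ4AaABAg",'
    '"responsibility":"developer","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"}]'
)
print(len(parse_coding_response(raw)))  # 1
```

Dropping invalid rows (rather than raising) keeps one malformed code from discarding the whole batch; a stricter pipeline might log the rejects for re-coding instead.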