Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "It's way too late to think of controlling AI. Pandora's Box is open. And you can…" (ytc_Ugwzcufue…)
- "@Angelawinterschild井 These Digital artists want everyone to stop using AI but…" (ytr_Ugx7vQMxL…)
- "Video: Makes criticism about ai and its various cons / AI bros: Haha im gonna ign…" (ytc_UgzkwEL3T…)
- "I fully believe both companies, not just Waymo, are on a mission to completely t…" (ytc_Ugx4lZgej…)
- "I believe people who use AI to generate art want to be able to express themselve…" (ytc_Ugxp5A-Hl…)
- "Ptff I'm deeply sorry for their lost and i feel bad that he passed, it sucks whe…" (ytc_UgynYBgD6…)
- "I've always used please & thank you. I wasn't completely sure, but always though…" (ytc_UgyJJNUFU…)
- "How about any money saved from automation needs to first go into paying for a UB…" (ytc_UgxIWW2lx…)
Comment
Let me posed the other way of thoughts: with the use of AI, more drugs can be discovered, better airplanes designed (more fuel-efficient) and so on. And all these will lead to a huge growth in technologies, science and engineering. No AI or robots can learned these easily. Yes, with higher demand for understanding the higher complexities in technologies generated by AI, more human jobs are created. Yes, AI may be able to handle a lot of the new problems encountered, but since it is new domain no existing AI knowledge may handle that knowledge easily.
Source: youtube · AI Harm Incident · 2024-08-09T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugy-mGwtqLHOfI-J-Zl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx-e2Vml_HXlKLaZ6F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyNI1yDYLo4wHcnYLt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyEnnjk2JcqVq_-CNV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugwqo0Bc_bqvHBNpx9h4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw5CRmC569Nlwtfqsp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-vYbz5BRTGJHl3Y14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZXQGJw2Dpwi16kCt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzvKnVT2YFNYwZAeOl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzM9lGHQvubrSklIv94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
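A downstream consumer of this view has to parse the raw model output and check each row against the codebook before storing it. Below is a minimal sketch of that step in Python. The allowed value sets are an assumption inferred only from the codes visible in this one response (the full codebook is not shown here), and the `validate_batch` helper name is hypothetical:

```python
import json

# Allowed values per coding dimension. These sets are ASSUMPTIONS inferred
# from the values visible in the response above, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "fear", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject malformed or out-of-codebook rows."""
    rows = json.loads(raw)
    coded = []
    for row in rows:
        # Comment IDs in this view are prefixed ytc_ (comment) or ytr_ (reply).
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: bad {dim} value {row.get(dim)!r}")
        coded.append(row)
    return coded

raw = ('[{"id":"ytc_Ugy-mGwtqLHOfI-J-Zl4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
batch = validate_batch(raw)
print(len(batch))  # 1
```

Indexing the validated rows by `id` (e.g. `{row["id"]: row for row in batch}`) then gives the O(1) "look up by comment ID" behavior this page offers.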