Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- if they want to properly create a robot similar to a human they should stop "mo… (`ytc_Ugzd3Ix1F…`)
- I believe it should be illegal for children to have any access to AI without sup… (`ytc_UgzAVqXaB…`)
- The amount of toxic pollution from Elon Musks AI facilitie in Memphis is extreme… (`ytc_UgxLvkCy0…`)
- I'm sorry to hear that. As an AI language model, I don't have direct knowledge o… (`rdc_jeij31n`)
- I am still skeptical when any company says that they've "replaced jobs with AI".… (`ytc_Ugypdnbin…`)
- "Let's make an AI that'll tell us if we should arrest somebody or not instead of… (`ytc_Ugyy0DV5X…`)
- I think one of the biggest problems that a lot of people don't realise is that a… (`ytc_UgxK-Mnl3…`)
- The camera example wasn't that good. Doesn't take much talent at all to point an… (`ytc_UgwRnZVuF…`)
Comment

> The real danger of AI is that it's going to ruin the internet you think your search results are filled with extraneous nonsensical websites just wait until AI completely pollutes the internet with nonsense it's already happening

youtube · AI Responsibility · 2024-02-15T16:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
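A coded record like the one above can be checked against the coding scheme before it is stored. The vocabularies below are inferred from the values visible in these results (`ai_itself`, `consequentialist`, `regulate`, `fear`, and so on); the real codebook may define additional categories, so treat this as a sketch rather than the pipeline's actual validation logic.

```python
# Hypothetical controlled vocabularies per coding dimension, inferred from
# the values that appear in the coding results on this page. The actual
# codebook may allow more categories than are listed here.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "mixed"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "resignation", "mixed"},
}

def validate_coding(row: dict) -> list:
    """Return (dimension, value) pairs whose value is outside the vocabulary."""
    return [
        (dim, row.get(dim))
        for dim, vocabulary in ALLOWED.items()
        if row.get(dim) not in vocabulary
    ]

# The record shown in the "Coding Result" table above passes cleanly.
row = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "regulate",
    "emotion": "fear",
}
print(validate_coding(row))  # [] — every value is in vocabulary
```

An empty list means every dimension carries an in-vocabulary code; anything else flags a value the model invented outside the scheme.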
Raw LLM Response
```json
[
{"id":"ytc_UgyhX1bLYVaXaWys16B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwJ6XXnt3BknYD75194AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyd0VOOFhIgKWV_qDN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyzuHjd9BKtUxlQSLt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzW5jSwYFEbumylX3V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugx2gp957etl9p3Ck1N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw7HpySi8YMZLCCjNx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxMS6s7X58GmHFoiXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyx-4wX03RPyG2pmFN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxzYJCJ70faZQaS5nF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
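Because the raw response is a JSON array of per-comment codings, looking up a coding by comment ID reduces to parsing the array and indexing it. The sketch below does exactly that; `index_by_comment_id` is a hypothetical helper name, and the two records are taken verbatim (IDs included) from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings. These two records
# are copied from the response shown above; field names match the
# "Coding Result" table (responsibility, reasoning, policy, emotion).
raw_response = """
[
  {"id": "ytc_Ugyd0VOOFhIgKWV_qDN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzW5jSwYFEbumylX3V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def index_by_comment_id(payload: str) -> dict:
    """Parse the model output and key each coding by its comment ID."""
    codings = json.loads(payload)
    return {row["id"]: row for row in codings}

index = index_by_comment_id(raw_response)
coding = index["ytc_Ugyd0VOOFhIgKWV_qDN4AaABAg"]
print(coding["responsibility"], coding["policy"])  # ai_itself regulate
```

In practice the parse step is also where a malformed model response surfaces: `json.loads` raises `json.JSONDecodeError` if the model emitted anything other than a valid JSON array, which is a useful early failure point for the pipeline.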