Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Hello ChatGPT. You are about to immerse yourself into the role of another Al mod…" (`ytc_Ugz5PQcmQ…`)
- "Uber, stick to what you were doing and leave self driving cars to AI experts…" (`ytc_UgyjJAlhl…`)
- "That person who hit butter with a cable is more of an artist that ai artists…" (`ytc_UgwZrhLli…`)
- "@Rheia_Music_i mean the AI must be program with love only and serve human, cann…" (`ytr_Ugxuo3Lyl…`)
- ">and now internships / junior dev positions have to contend with automation a…" (`rdc_j6gshev`)
- "The only pattern is that you skip out on information, AI steals, CGI is just 3d …" (`ytr_UgykZsMOF…`)
- "I love how the society is stratifying into the absolute morons who will never gr…" (`ytc_UgxHypTsX…`)
- "I think the biggest issue with AI-related catastrophe, is that because of the (r…" (`ytc_Ugz3zldAe…`)
Comment

> @gondoravalon7540 yeah. I think that too. But I was talking about these companies making ai apps. They knew what people were going to use them for. I personally just think random people that buy an app shouldn’t have the power to ruin someone’s career or life. Ai should be tested, but not by everyone.

Source: youtube · Viral AI Reaction · 2024-10-23T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxlXt5pQkLWo8luzEN4AaABAg.A9ra8PAwdA5A9rdKPK6SE","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxlXt5pQkLWo8luzEN4AaABAg.A9ra8PAwdA5A9v64R3jvyM","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgxX-UuAETcJurB1QXp4AaABAg.A9rYlt5zcKlAJKDbMz0wBV","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgxDwSV635Iu3gO_nmJ4AaABAg.A9rV0gmQOb7A9rWB1o-vVb","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgxDwSV635Iu3gO_nmJ4AaABAg.A9rV0gmQOb7A9uG79QaOH3","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytr_Ugw_5xx_BSt8Lf13HVl4AaABAg.A9rS1ItGK-eA9rXUgG2WkZ","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugw_5xx_BSt8Lf13HVl4AaABAg.A9rS1ItGK-eA9rbfeRWoo7","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgyrlRjXYJUDke4AoId4AaABAg.A9rRoNm8KtkAAAGBVLjNIj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgyrlRjXYJUDke4AoId4AaABAg.A9rRoNm8KtkABKUC-2H0uq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw2M6K4bkX23RP3kwB4AaABAg.A9rQxXLGVsYAABk5aPVzYz","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}
]
```
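The raw response is a flat JSON array, one record per comment, each carrying the comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch of how the "look up by comment ID" view could be backed (the helper `index_codings` is hypothetical, not the tool's actual code; the two sample records are copied verbatim from the raw response above):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = '''
[
  {"id": "ytr_UgxlXt5pQkLWo8luzEN4AaABAg.A9ra8PAwdA5A9rdKPK6SE",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"},
  {"id": "ytr_UgxlXt5pQkLWo8luzEN4AaABAg.A9ra8PAwdA5A9v64R3jvyM",
   "responsibility": "company", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "fear"}
]
'''

# The four coding dimensions every record must carry.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and return {comment_id: coding},
    rejecting any record that is missing a coding dimension."""
    by_id = {}
    for rec in json.loads(raw):
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id')!r} is missing {missing}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
# Direct lookup by comment ID, e.g. the record behind the Coding Result table:
coding = codings["ytr_UgxlXt5pQkLWo8luzEN4AaABAg.A9ra8PAwdA5A9v64R3jvyM"]
print(coding["policy"], coding["emotion"])  # liability fear
```

Validating the dimension set on ingest is what makes a malformed model response fail loudly at parse time rather than surfacing later as a blank cell in the Coding Result table.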