Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Why so serious? Humans are exceptionally good at faking the very emotions that u…" (ytc_UgxN_Iqa2…)
- "I remember seeing a retweet saying that doing all this isn't gonna help other th…" (ytc_UgzMQxdU2…)
- "Technically any content going through an LLM is \"transformed\" the core unit of L…" (ytc_UgyI_Cu01…)
- "Well, GPT-4 definitely has pretty good degrees of self-awareness. You can confi…" (ytc_Ugy4s13M6…)
- "Approach this subject as if the worst-case scenario has already happened. The b…" (ytc_UgzcdTFVp…)
- "Ai has been blackmailing people already. So when it knows everything about every…" (ytc_UgzxGNQoG…)
- "i don't really see it being good since the eyes are just static(not moving/expre…" (ytc_Ugzk0r_oE…)
- "You would think that when creating AI, the three laws of robotics is literally t…" (ytc_UgzNhhpTs…)
Comment
Maybe you can invest in robots that do the work you used to do. You pay a fee for their maintenance and updates and you get a set earnings for the amount of work the robot does. If you have some expertise in the field of work the robot is, you can critique or audit it's work and submit corrects and recommendations for additional earnings. So you're offloading some of the overhead for a company for a fee basically. Just kind of made that up based on the video thumbnail, a way for cash to still flow to humans.
Source: youtube · Video: AI Harm Incident · Posted: 2025-05-20T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwOpmJBnoX4ZER9ssl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxbhAaXVp2EfDXjE4h4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwSZ7l5Gq8GaXeGrZp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzbesUnDzTk-Gh1Vrt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxMyCDyD6F5U53-9fx4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugw63Vs_QOO-yKrRBTB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugx3kQCSOniRWHjgB9p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzafKZ4xPiyAJq9M754AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzKyC4B_DUj8v_fNPR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwuTw-D6rDH58PiNuN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
```
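The raw response is a JSON array of coded records, one per comment, each carrying the four coding dimensions plus the comment ID. A minimal sketch of how such a response could be parsed and sanity-checked downstream (the `validate` helper and key set here are hypothetical, not part of the pipeline shown; the embedded string is an excerpt of the first three records above):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above (first three records).
raw = '''[
  {"id": "ytc_UgwOpmJBnoX4ZER9ssl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxbhAaXVp2EfDXjE4h4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwSZ7l5Gq8GaXeGrZp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# The four coding dimensions visible in the table above, plus the ID.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def validate(records):
    """Raise if any coded record is missing an expected dimension."""
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')} missing keys: {missing}")
    return records

records = validate(json.loads(raw))
emotions = Counter(r["emotion"] for r in records)
print(emotions)  # Counter({'approval': 2, 'indifference': 1})
```

A check like this catches the common failure mode of LLM-coded output: a record silently dropping a field, which would otherwise surface only later as a gap in the aggregate counts.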