Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (select one to inspect)
Imagine openai would programm ChatGPT to act like an evil conscious AGI as an ap…
ytc_Ugw6x1lHL…
Wouldn't consciousness technically be AI writting it's own code?
If text AI can…
ytc_UgzF9jIjm…
The actual thing we did not realize yet. Is that LLMs are terrible at many thing…
ytc_UgwGg1Ms9…
Business/ governments strive to lower the costs of production, outsourcing by Ro…
ytc_UgxOmXnYL…
ChatGPT is not work ChatGPT is so far right fucking wing.
Try to get it to tell…
ytc_UgxSLZeOe…
AI art inspired me to work more on my comic. It's going to be a long time until …
ytc_UgyUfyn7x…
Obviously 🙄 these are two different people and I don’t need any facial recogniti…
ytc_Ugw-_m5eZ…
I plan to do that in a future video! However, this one was for beginners and I d…
ytr_Ugy-EBSVd…
Comment
It didn’t attack, the operator turned the robot on while it was hanging, the balancing system kicked in and the robot started flailing its arms around because it was off balance. When it wasn’t working it to correct the balance it just flapped and kicked around. Its sensors thought it was free falling.
youtube
AI Responsibility
2025-05-06T14:5…
♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgyKKfy3nwbCy256MRB4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgyAvOJglQRJAGxTrlp4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugw21EmLviiyjPFB-a14AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxnQh9obpqS08_IjXd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx5Kllb6-WkGDguRAR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxZZS0fA8FEg4fIY5N4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw4Mqakf3-pjZBJq-R4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwNYxWpfhXAaO_rBUx4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxpusrJK9TL8riDlWN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgynTBg1mOboBOVqMNB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
```
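The look-up-by-comment-ID operation described above can be sketched as a small parser over a raw batch response. This is a minimal illustration, not the tool's actual implementation; the `lookup` helper name is hypothetical, and the sample string is truncated to two entries from the response above for brevity.

```python
import json

# Two entries copied from the raw LLM response shown above (illustrative subset).
raw = (
    '[{"id":"ytc_UgyKKfy3nwbCy256MRB4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"unclear","policy":"unclear","emotion":"indifference"},'
    '{"id":"ytc_UgyAvOJglQRJAGxTrlp4AaABAg","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
)

def lookup(raw_response: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if the ID is absent (hypothetical helper for illustration)."""
    for entry in json.loads(raw_response):
        if entry["id"] == comment_id:
            return entry
    return None

coding = lookup(raw, "ytc_UgyKKfy3nwbCy256MRB4AaABAg")
print(coding["responsibility"])  # → ai_itself
```

The values returned for the first ID match the Coding Result table above (responsibility `ai_itself`, reasoning `unclear`, policy `unclear`, emotion `indifference`).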