Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Comment
Again, your example of an AI misinterpreting a request to protect your children is not compatible with the idea that it is super intelligent.
If you told a human to protect your children and they went out and destroyed all other forms of life, you would consider them to be a complete idiot. It is obvious why that would be bad, but also why it would actually hurt the children not protect them.
Platform: youtube | Topic: AI Governance | Posted: 2023-03-30T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzwD6Wp3FGOC1k42hd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzqhf7uWSU6IkmJPpd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZyhZBunwvN2zJj1l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx5v1KyCad7gOsEuT14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw2CXobofVqco1aikV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxWyKVgkLVDREXz0jh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyRAcP3-osLT2MLvS54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxXb5eT79mwO21SuxN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxkn3L6S39tMYDAlDN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzm87-lGrqI37qAZ8V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
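A raw response like the one above is a JSON array with one coding record per comment, and the per-comment view is just the record whose `id` matches. The sketch below shows one plausible way to implement that lookup; the function name `lookup_by_comment_id` and the inline sample array are illustrative assumptions, not part of the actual tool.

```python
import json

# Illustrative raw LLM response: a JSON array of coding records,
# one object per comment, keyed by the comment's "id".
raw_response = """
[
  {"id": "ytc_Ugx5v1KyCad7gOsEuT14AaABAg",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxXb5eT79mwO21SuxN4AaABAg",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"}
]
"""

def lookup_by_comment_id(raw: str, comment_id: str):
    """Parse a raw response and return the coding record for one comment ID,
    or None if the model did not emit a record for that comment."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup_by_comment_id(raw_response, "ytc_Ugx5v1KyCad7gOsEuT14AaABAg")
print(record["responsibility"], record["emotion"])  # ai_itself indifference
```

The dimension values in the matched record (`responsibility`, `reasoning`, `policy`, `emotion`) are exactly what the "Coding Result" table displays for that comment.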