Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
The question regarding the part about "...tqking directives from humans..." tend…
ytc_UgxFd9C1j…
I remember my friend Mr Elon musk saying AI is the most dangerous weapon ever bu…
ytc_UgydrGzAB…
@disorderandregression9278 the difference between that and ai is it takes money,…
ytr_UgwBxQCSx…
I would argue that using AI to create art most closely resembles having a piece …
ytc_Ugzlkt45_…
So sick of fake robot vids. Show me were they truly are at today .…
ytc_UgyorCV3_…
Tech layoffs aren’t about AI replacing workers — they’re about companies realizi…
ytc_UgylrNBy9…
AI couldn't have come out at such a worse time over the course of the past 30-40…
rdc_nc9le5d
Good! No American jobs lost here. India on the other hand we'll probably see som…
ytc_UgwBc7LOd…
Comment
In defence of HAL, he's very proud that the 9000 series has never made a mistake or corrupted data. Then he predicts a failure on a piece of equipment that tests out fine. This worries the two astronauts as HAL controls the entire ship. What we discover towards the end of the film is that HAL was aware of the true purpose of the mission; but was instructed not to reveal this information to the crew, in effect he was instructed to give the crew "corrupt data". As an AI his solution to the problem was to remove the crew and continue the mission on his own.
youtube
2025-09-17T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
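A coding result like the one above can be sanity-checked against the category labels that appear across this page. Here is a minimal validation sketch in Python; the `SCHEMA` sets are inferred only from the coded examples shown here, so the real codebook may allow additional values:

```python
# Allowed values per dimension, inferred from the coded examples on this
# page (the actual codebook may define more categories).
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "resignation",
                "approval", "mixed", "unclear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimension names whose value is missing or outside the schema."""
    return [dim for dim, allowed in SCHEMA.items()
            if record.get(dim) not in allowed]

# The coding shown in the table above passes validation.
coding = {"responsibility": "ai_itself", "reasoning": "unclear",
          "policy": "unclear", "emotion": "indifference"}
print(validate(coding))  # []
```

Running `validate` over every record in a batch response is a cheap way to catch an LLM that drifts outside the codebook's label set.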
Raw LLM Response
[
{"id":"ytc_Ugya2WO17k7qsrjqVc94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyeswcyBRkFBubeumx4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwoerbfoBX8bU1zc5V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwKy-2hDLLy9Wj4QJl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugyyp4o9kovY6y0oyTp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx4JeV0WMg8bODzW_l4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwMy0eAtC6fWwrXhmR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzwAiOeI7Lb01xjTrZ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwUNT2wEE6SjArGph14AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzS0tpW4ETPklQeBK14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
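The raw response is a JSON array of per-comment coding records, which makes lookup by comment ID straightforward. A minimal sketch in Python, using two records copied from the response above as sample data:

```python
import json

# A raw LLM response: a JSON array with one coding record per comment.
raw_response = """
[
 {"id":"ytc_Ugya2WO17k7qsrjqVc94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugyyp4o9kovY6y0oyTp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
"""

# Index the records by comment ID so any coding can be inspected directly.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codings["ytc_Ugyyp4o9kovY6y0oyTp4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # ai_itself indifference
```

The same dictionary powers the "Look up by comment ID" interaction at the top of the page: each inspected comment simply resolves its ID against the parsed batch.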