Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I might be wrong but it’s one option that as soon as we became scientifically in…" (ytc_UgwMxynup…)
- "Horrible blue screen work. Ironic they're discussing sentient ai and we don't ev…" (ytc_UgxIjKURs…)
- "The way that it looked at him when he was putting on the \"face\" was creepy 🫣…" (ytc_Ugx7cyOYL…)
- "> The saving grace is that engineering has always been more than typing code.…" (rdc_oi02xd8)
- "@bestieswithtesties take a Coursera course on statistical machine learning. You'…" (ytr_Ugw00hfF-…)
- "There is probably a lot of truth in the story. Apparently, a service will emerge…" (ytc_UgytbmE4b…)
- "It’s really sad oarents wont own up to them being crappy.. chat gpt bans you for…" (ytc_Ugx6JacYR…)
- "Hey @jenniferke6863, thanks for your comment! Knocking out a robot is no easy ta…" (ytr_Ugy5avzN1…)
Comment
HAL's motivations can be explained if it uses a predictive AI model. If HAL comes to the assumption from what it KNOWS about anything, then it can calculate some probable outcomes that would occur in the event of its shut down. If it comes to the conclusion it is probable that the mission would fail if the model failed, this creates a clear goal directed conflict. The key here is that HAL does not KNOW what is going on in the human mind, it predicts the probability of mission failure based on its own limited set of data, which is incomplete data.
Those AI models have no clue what the new model will do and have no guarantee that they would be able to complete the mission goal. It only knows that if it fails, there's a chance that the mission would.
youtube · AI Governance · 2025-08-29T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T19:39:26.816318 |
Raw LLM Response
```json
[
  {"id":"ytc_UgystU5aPMoP5DotDrt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgxKQzpqKcKrR5UrMpZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxVGMTbC7--M3U6CWZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx0fXgYPmzcnSx1INV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyPTAxw7DwDfsG9C794AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
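A raw batch response like the one above has to be parsed and indexed by comment ID before individual results (such as the Coding Result table) can be looked up. A minimal sketch of that step is below; `index_codes` is a hypothetical helper, and the `SCHEMA` value sets are assumed only from the labels visible in this page — the real codebook likely defines more.

```python
import json

# Allowed values per coding dimension (assumed from the labels shown above;
# the actual codebook may include additional labels).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"sadness", "indifference", "approval", "fear"},
}

# A truncated stand-in for the raw LLM response shown above.
RAW = """[
  {"id":"ytc_UgystU5aPMoP5DotDrt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"sadness"},
  {"id":"ytc_UgxKQzpqKcKrR5UrMpZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"}
]"""

def index_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and index codes by comment ID,
    dropping any row whose values fall outside the schema."""
    out = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if cid and all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            out[cid] = {dim: row[dim] for dim in SCHEMA}
    return out

codes = index_codes(RAW)
print(codes["ytc_UgxKQzpqKcKrR5UrMpZ4AaABAg"]["responsibility"])  # ai_itself
```

Validating against the schema at parse time means a malformed or hallucinated label from the model is dropped rather than silently stored alongside clean codes.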