Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_UgiL9TsF2… : "This example uses human error to portray a problem with a self driving car: driv…"
- ytc_UgyMS9bTR… : "The universal consciousness came up with the idea of creating a lifeform which c…"
- ytc_UgyZc9CVE… : "Who is running the mom AI workshop and why are they so efficient. I'm on Teract …"
- ytr_Ugx1txpX5… : "It's not bullying. They should learn online. Even if they post AI ''images'' the…"
- rdc_ktx542i : "Hey chatgpt, using the summary above, please create a long Reddit post that will…"
- ytc_UgxNTmXKX… : "Ill be honest, I don't really care about AI art, the people who pay for art are …"
- ytc_Ugxu4DLjS… : "Okay, the reasoning AIs is the first time that I have actually been terrified of…"
- ytc_UgxvBMXFx… : "Everybody suddenly gets scifi brain whenever LLMs that are being called AI comes…"
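Since the interface shows IDs truncated with a trailing ellipsis, looking one up means matching the visible prefix against the full IDs stored with the coding results. A minimal sketch of such a lookup (a hypothetical helper, not the tool's actual code):

```python
def lookup_by_prefix(records, prefix):
    """Return all coded records whose id starts with the (possibly
    truncated) prefix; an unambiguous prefix yields exactly one hit."""
    prefix = prefix.rstrip(".\u2026")  # drop a trailing ellipsis from the UI
    return [r for r in records if r["id"].startswith(prefix)]

# Example records (full IDs taken from the raw response below is one
# plausible source; these two are illustrative).
records = [
    {"id": "ytc_UgxAsrL7W2vogmdeBMh4AaABAg", "emotion": "fear"},
    {"id": "ytc_Ugym8HtVa_l-1WNTuUh4AaABAg", "emotion": "fear"},
]

hits = lookup_by_prefix(records, "ytc_UgxAsrL7…")
# An unambiguous prefix resolves to exactly one record.
```

A prefix that matches several IDs would return multiple hits, so callers should check the length of the result rather than assume uniqueness.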
Comment (quoted verbatim):

> The thing is...HOW AI WILL GET LOSE ?
> AI needs computing power to live, it's not a cat thay can live anywhere.
> So how mad AI will be released ?
> Only by a crazy hacker and that AI will create copies.
> The AI will need to live off the grid so it will contaminate a satelite.
> The AI might create caos if it attacks banks and the internet.
> But the only way to be really free ?
> If it replicates hardware in another planet we will be doomed.

Platform: youtube | Topic: AI Governance | Posted: 2021-09-04T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwMZwDrqAWQX3WeKG14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxmWj0-Duk-E_v8mbN4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxAsrL7W2vogmdeBMh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugym8HtVa_l-1WNTuUh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzhlq6YBPZ7Yra7w_54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxut42Cf1sSUURfIuN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyV4aWdLq_4BFCJRH94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyg70GkSx8Md4fdF7l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyK18H-5UAQqnWOen54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwn1p--fu5hYSrwjMF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
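Before a raw response like the one above is turned into a coding-result table, it helps to validate each row against the coding dimensions. A minimal sketch follows; note the allowed value sets are only those observed in this batch (responsibility, reasoning, policy, emotion), not necessarily the full codebook:

```python
import json

# Allowed values as inferred from this batch of coded comments.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed"},
}

def validate_response(raw_json):
    """Parse the model output and split rows into (valid, errors).

    Each error is (comment_id, [dimensions with an unexpected value]).
    """
    rows = json.loads(raw_json)
    valid, errors = [], []
    for row in rows:
        bad = [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]
        if bad:
            errors.append((row.get("id"), bad))
        else:
            valid.append(row)
    return valid, errors

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
valid, errors = validate_response(raw)
# A well-formed row passes; a row with, say, "emotion":"joy" would be
# reported in errors instead.
```

Collecting errors rather than raising on the first bad row lets the tool display a full batch with problem rows flagged, which matches how the coding results are browsed per comment.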