Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgwdVcR8o…: "Policing AI will be like policing hackers. You have to recruit the smartest crim…"
- ytc_UgzVxExSL…: "Who needs this ? What is the purpose ? Who benefits? All we need is love, kindn…"
- ytc_UgycZhp_s…: "I knew I was giving more of my digital soul to Google but Gemini was more helpfu…"
- ytc_Ugz0sahVj…: "They're right tho / 1: we're not obligated to support real artists / 2: AI is a tool …"
- ytr_Ugw2Wtnxb…: "@zzygyy Are the "guides" also taught by AI? What's their educational background…"
- ytc_UgxbHOoMq…: "I literally just did it with Gemini. I asked Gemini to give me the answer as a P…"
- ytr_UgzRiDCdW…: "Hey @afgboyy8581, thanks for commenting! My brain is currently searching for the…"
- ytc_Ugz0BrqcR…: "AI and robotics don’t have a conscience. What would they gain if they tried to t…"
Comment
Finally, an intelligent conversation that actually highlights the real dangers of AI. We are only in the early stages of exploration, but when we log into systems we have to prove that we are human and not a robot. If you program technology to have a level of intelligence to learn, the speed a computer can learn over a human is exponential. Just look at the small example Roman gave about the calculator being more intelligent than a human. If technology is used in the future to replace human tasks, it will not have the same capabilities to risk assess like a human. To use Roman's statement about some humans: "technology won't have empathy or the human quality of understanding suffering"! It's not only AI that concerns me, it's the humans running it and their intentions!
Platform: youtube
Posted: 2024-07-18T09:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxbc-G2VWJ6gI3_LPh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQi3_hZrb3jyO_gNN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
{"id":"ytc_Ugw8vRXJawrz6SjnVth4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNgiY34q1aWsKIJq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz_UWo_i06vU0eFbrR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyDBS36b-RADYW0hgt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgywshKs58OeMmGVFiF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzbqSPApXQrEYGSwtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw2fS5_bKNSA4KFDWZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxIN-C-3W4UzGE6SUd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
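The raw response is a JSON array of coding records, one per comment, keyed by comment ID. The "look up by comment ID" step above can be sketched in a few lines of Python; this is a minimal illustration using two records copied from the response, not the inspection tool's actual implementation.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''
[
  {"id": "ytc_Ugw8vRXJawrz6SjnVth4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzbqSPApXQrEYGSwtt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
'''

def lookup(coded, comment_id):
    """Return the coding record for a comment ID, or None if absent."""
    return next((rec for rec in coded if rec["id"] == comment_id), None)

records = json.loads(raw_response)
rec = lookup(records, "ytc_Ugw8vRXJawrz6SjnVth4AaABAg")
print(rec["emotion"])  # approval
```

For larger batches, building a dict `{rec["id"]: rec for rec in records}` once makes repeated lookups O(1) instead of scanning the list each time.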