Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "We all get a piece of the ai right? we’d all own it? it’d be dystopian otherwise…" (ytc_UgyY4yEqo…)
- "South Korea is the only democratic country that has coronavirus under control wi…" (rdc_fnx6ap8)
- "I don't think AI is as amazing as these people believe. I think this is human hu…" (ytc_UgyrSp0AD…)
- "People do bad things always have and AI will provide the worst imaginable opport…" (ytc_UgwH4fhLR…)
- "I think "I don't know" is a perfectly reasonable answer. That's something thaT C…" (ytc_UgxfR3EMY…)
- "AI would not exist without humans. It would have no purpose. It would sit there …" (ytc_UgwjJ6EUS…)
- "This is from Gemini: It’s a fair point—the 'environmental receipt' for AI is mas…" (ytc_UgwZMlDrg…)
- "I didn't mean "becoming aware" I meant decision making based on predefined rules…" (ytc_UgwsCvj91…)
Comment
It could be an “extraordinary” and “extraordinarily disastrous” in terms of benefits vs the consequences!
But can it be stopped?
If not, what should be done then?
Nuclear technology is extraordinarily beneficial if used in right purpose! Same time, it’s extraordinarily disastrous if misused!
Who will draw the red lines in case of AI? How those red lines can be enforced? That should be thought over!
Source: youtube · Video: Viral AI Reaction · Posted: 2024-11-08T19:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_icdrDVCcNs61Iul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgygUca3wuVR00qvooJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyHPzwMxPP2imBQ6bJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXKVpD__E8OeixSMl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzdW-f4jTJK79cG_kp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwZtKJo4zcJjqcxGUd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzyaL7Mf3VBHRQ17I54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxDBzpuKEYLLyK05JV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxP1S6MqgCj5SVLEOJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyILBoS75wQe6HGQ_h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
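The raw response is a JSON array of coded records keyed by comment ID, so the "look up by comment ID" feature reduces to parsing the array and indexing it. A minimal sketch: the field names come from the response above, but the variable names are illustrative and the array is truncated to two records for brevity.

```python
import json

# Raw LLM response as shown above (truncated to two records for brevity)
raw_response = """
[
 {"id":"ytc_UgxDBzpuKEYLLyK05JV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgxP1S6MqgCj5SVLEOJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
"""

# Index the coded records by comment ID for constant-time lookup
coded = {record["id"]: record for record in json.loads(raw_response)}

# Look up one comment's coded dimensions by its ID
record = coded["ytc_UgxDBzpuKEYLLyK05JV4AaABAg"]
print(record["responsibility"], record["emotion"])  # distributed fear
```

If the model wraps its JSON in prose or a code fence, the array would first need to be extracted before `json.loads` will accept it; the sketch assumes a clean array as displayed here.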