Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "1st of all if AI KILLS IT IS BECAUSE AI WAS TOLD TOO THERE FULL OF IT AI WOULD N…" — ytc_UgzeNXtgZ…
- "Computers and A.I didn't birth self and decide to destroy humans/ humanity. Gre…" — ytc_UgwEUR0bf…
- "This is only a problem because most people are dumb slaves, not directly because…" — ytc_Ugw845u_b…
- "Perhaps researchers and companies should stop saying what they have developed is…" — ytc_UgzdQ8KuN…
- "They can actually have some incredibly deep conversations when you talk to them …" — ytc_Ugya_JS6h…
- "Autopilot doesnt mean you can not focus on the road. Its just automated driving,…" — ytc_Ugw0eqJEv…
- "It’s a program that you put use to essentially make your art completely incompre…" — ytr_UgxymDU9f…
- "Reel art: basic supplies. AI art: Demands nuclear power plants, massive clean wa…" — ytc_UgyNWEVpN…
Comment
The fact that the Echo story was written by AI is kinda mind blowing. As if it may be not quite possible to take over the world at this point, or maybe the AI just doesnt want to. I do wonder though if our world is not quite digital enough to be connected enough for it to function alone, we still have humans doing a lot of jobs. If all major materials are robotically manufactured and sourced from the ground up by robots then it could actually provide for itself
youtube · AI Governance · 2024-05-25T08:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwBt-r4d8XDChlmOfF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugwio6AQlFx4Up6q8Eh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxMQ9iJcnZ3IJJ4RRB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgxuFueYLKZ_LXszbGl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugwq0M04tcCY3K-ZJTh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_UgyZoCNu7m5ErULXBeh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugx6qZr3m88UjQANVsV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_Ugyes8I9SUC9fpMnXYd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
 {"id":"ytc_UgwwWFIo5dPgoh7z9eh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
 {"id":"ytc_UgzF7BGGrAHRKNZilVd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
```
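The raw response above is a JSON array of coded records keyed by comment ID, so the "look up by comment ID" view can be implemented by indexing the parsed array. A minimal sketch, assuming only the field names visible in the response (the `index_by_id` helper and the two-record excerpt are illustrative, not the tool's actual code):

```python
import json

# Excerpt of a raw model response: a JSON array of coded comments.
raw_response = '''[
 {"id":"ytc_UgwBt-r4d8XDChlmOfF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugx6qZr3m88UjQANVsV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and index each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
# Looking up one ID returns its coded dimensions, matching the table above.
print(coded["ytc_Ugx6qZr3m88UjQANVsV4AaABAg"]["emotion"])  # approval
```

In practice the same index would be built once over the full response, so both the random-sample view and the ID lookup read from one dictionary.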