Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- `ytc_UgxNHFW9D…`: "I hadn't thought enough about feelings being in the body. I don't know if such s…"
- `ytr_UgyDzigDi…`: "@fingolfin420 Look up how neural networks actually work (channel 3Blue1Brown h…"
- `ytr_UgwQ2VO8S…`: "This! Good on you for looking past the catchwords David. Although there is much …"
- `ytc_UgxAaGyiZ…`: "As a paralegal, it’s good to know that AI won’t be taking my job anytime soon.…"
- `ytc_UgyJNqytx…`: "@exurb1a ... NOW HERE IS THE QUESTION OR SHOULD I SAY QUESTIONS..... Was this vi…"
- `ytc_Ugww3bed-…`: "So no fights, no discussions and it'll never cheat. Put this in a robot that can…"
- `ytc_UgxeK_COO…`: "When AI and robots do everything, governments around the world will require peop…"
- `ytc_UgxsDh4yy…`: "yes im on a project of making a Ai Model Learning Human Emotions like you can T…"
Comment

> Interesting conversation. I’d imagine that the presented solution of governments getting together to establish and enforce laws to slow down the development of AI until safety/control has been figured out, would probably work. But only if that’s done soon. Give it 2-5 years when humanoid robots are more developed and deployed, do you think the government will even have the authority to maintain control over these companies? I’d imagine the CEOs would laugh at the thought of being told to stop. “What are you going to do about it?” Just a thought

youtube · AI Governance · 2026-01-10T07:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
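A coding result like the one above can be checked dimension by dimension. The sketch below is a hypothetical validator; the allowed value sets are inferred from the values that appear on this page, not from an official codebook, and the `validate` helper is illustrative.

```python
# Hypothetical validator for one coding result.
# NOTE: the allowed sets below are inferred from values observed in the
# sample output on this page, not from an authoritative codebook.
ALLOWED = {
    "responsibility": {"user", "none", "company", "ai_itself",
                       "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "mixed", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "resignation", "approval"},
}

def validate(coding: dict) -> list:
    """Return the names of dimensions whose value is missing or unknown."""
    return [dim for dim, ok in ALLOWED.items() if coding.get(dim) not in ok]

# The coding result shown in the table above passes cleanly.
result = {"responsibility": "government", "reasoning": "contractualist",
          "policy": "regulate", "emotion": "approval"}
print(validate(result))  # [] -> every dimension is a known value
```

An empty list means the coding is well-formed; any returned name points at a dimension the model filled with an out-of-vocabulary value.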
Raw LLM Response

```json
[
  {"id":"ytc_UgxCWhyNITLoD7lnF6N4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynBa5rZl2Tdm-R5Rh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxq1ISc4esin8uPi0x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzlG0L_HGScFq7AcMt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw2BIo6uU-A6hXrIUp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwa_RY409oMZzZKi7N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyqIqBe5LiBWxlgX-t4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwnZzzmdmruqbDNMm94AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyBJluWSJ6tkUrODqB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzhBpd5YKmxECTov_14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"}
]
```
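The lookup-by-comment-ID flow can be sketched as follows. This is a minimal illustration, assuming the raw model output is a JSON array of records like the one above; the `index_by_id` name is hypothetical, and the two records embedded below are copied from that response.

```python
import json

# Two records copied verbatim from the raw LLM response above,
# standing in for the full array.
RAW_RESPONSE = """[
  {"id":"ytc_UgyqIqBe5LiBWxlgX-t4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwnZzzmdmruqbDNMm94AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"}
]"""

def index_by_id(raw: str) -> dict:
    """Parse the raw model output and map each comment ID to its coding."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_id(RAW_RESPONSE)

# Look up the coding for one comment ID.
row = codings["ytc_UgyqIqBe5LiBWxlgX-t4AaABAg"]
print(row["policy"], row["emotion"])  # regulate approval
```

Because the model returns one JSON object per comment, a single parse-and-index pass is enough to serve any subsequent ID lookup without rescanning the response.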