Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "So I also do this cause it feels natural 😂 but i have once heared a friend sayin…" (ytc_UgzWJw4H0…)
- "That's why I don't accept the world accepting robots side by side with humans, b…" (ytc_UgwIsqP1X…)
- "I have told people for years that all those things "sentient AIs" says to the pu…" (ytc_Ugxo3DxSO…)
- "This is why Zuck and Musk are using illegal energy turbines to race for ever lar…" (ytc_UgwOec65R…)
- "Skynet is now our reality. AI is in everything now. Just a matter of time before…" (ytc_UgzsWeas2…)
- "Have no problems with HONEST TRANSPARENT Non-Artist sellers using AI... Just the…" (ytc_Ugzs1X-B7…)
- "In america right now we're super pissed about a gorilla so I don't know if anyth…" (rdc_d3rjmla)
- "This is the future. What are we supposed to do? Stop the production of robots an…" (ytc_UgxS4-wMl…)
Comment (youtube · AI Governance · 2025-09-21T12:0…)

"I use AI when I do a search and want a more in depth answer, and then asking copilot is even a more advanced search for an important question. On the other hand if I am just looking up the meaning of a word I use the normal search engine for that. I know AI should not be used for medical questions but so far AI has provided accurate answers. As of today, its all good. In the future, well who knows?"
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz75BRy43mxZOrXEu54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx3HQwFppCl1j1Zha14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgweCBjWGgPoWFNaOyR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugye7H2e2tZZ8uxGzx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx2sBiRspZRqpAvUMp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwoX5GrLveiF9UZPPt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgxgcE43MvydC4pYifV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw22vRnyjYiGy9ZDGp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx3yjrnX4DHTwySuV14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgweAu6DIW3Xa7xP5QZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
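A raw response like the one above is only usable downstream if every record carries an in-schema label for each coding dimension. The sketch below (not part of the dashboard itself) shows one way to check that; the allowed label sets are inferred from the values that appear in this output, and the actual codebook may define more.

```python
import json

# Allowed labels per coding dimension, inferred from the output above;
# the real codebook may include additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"fear", "mixed", "approval", "indifference", "outrage"},
}

def validate(records):
    """Return (comment_id, dimension, value) tuples for out-of-schema labels."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

raw = ('[{"id":"ytc_Ugz75BRy43mxZOrXEu54AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(validate(json.loads(raw)))  # an empty list means every label is in schema
```

Running this over a full batch before writing the coding table would catch malformed or hallucinated labels early, rather than at analysis time.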