Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "AI will continue to fall into the category of human skepticism as long as it fai…" (`ytc_Ugwnyft9u…`)
- "AI is going to kill Google like Google killed the Yellow pages. What did anyone …" (`rdc_m27ymg1`)
- "Arby's was already bad, but they replaced their lively commercials with AI. I kn…" (`ytc_UgxDKD3q4…`)
- "It’s simple asking AI to create a video of trump sticking his head up his own as…" (`ytc_UgwZi_wjJ…`)
- "For shoplifting? Really? Aren't there any more serious crimes they could use the…" (`ytc_UgzJTFVEZ…`)
- "artificial intelligence was instructed to aim towards russia, china, and india b…" (`ytc_Ugz2ISQMI…`)
- "As much as this is stupid, probably was the only way OpenAi didn't go bankrupt.…" (`rdc_o7w3q82`)
- "Said this for years av worked on building new amazon's and this is only the begg…" (`ytc_UgxVzUCNO…`)
Comment

> AI could fly planes and subs, but if something goes wrong, its gonna try everything in the hand book to fix it(even if its a new problem. humans have the ability to think outside the box with our imagination to fix it. For now

| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2024-05-23T23:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxsD5jfyViQLj5sWcZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGdBSsfowTGGPsUup4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgynXauCH7yq4tn0IcB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyyzV7S70TZzfACRoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7VtzT_AtKDu5TUYJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwNNajhdXxW6nnThvh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugxc7pgFpg8V46At66d4AaABAg","responsibility":"government","reasoning":"unclear","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyEpRZhcfAhSOoroDt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyDLw_cLbrle9MHvM14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgydiMB34WilEsB0XS94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
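The raw response above is a JSON array with one object per coded comment, keyed on `id` with four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch response could be parsed and indexed for the "look up by comment ID" view is shown below; `index_codings`, `EXPECTED_KEYS`, and the two-row sample payload are illustrative, not part of the tool itself.

```python
import json

# Abbreviated example of a raw batch response, in the format shown above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyEpRZhcfAhSOoroDt4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxsD5jfyViQLj5sWcZ4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
"""

# Every entry is expected to carry these keys; anything else is malformed.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment ID,
    skipping any entry that lacks one of the expected dimensions."""
    rows = json.loads(raw)
    return {
        row["id"]: {k: row[k] for k in EXPECTED_KEYS if k != "id"}
        for row in rows
        if EXPECTED_KEYS <= row.keys()
    }

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_UgyEpRZhcfAhSOoroDt4AaABAg"]["emotion"])  # outrage
```

Indexing by ID rather than list position keeps the lookup robust even if the model returns entries in a different order than the comments were submitted.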