## Raw LLM Responses
Inspect the exact model output behind any coded comment: look a comment up directly by its ID, or click one of the random samples below.
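As a rough sketch of what these two inspection paths might look like over a coded-comments store (the JSONL file name and helper names are assumptions for illustration, not the tool's actual layout):

```python
import json
import random

CODED_PATH = "coded_comments.jsonl"  # assumed store: one coded comment per line

def load_coded(path: str = CODED_PATH) -> list[dict]:
    """Read all coded-comment records from a JSONL file."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]

def by_id(records: list[dict], comment_id: str) -> dict | None:
    """Direct lookup by comment ID, e.g. 'rdc_je3p00e'."""
    return next((r for r in records if r.get("id") == comment_id), None)

def random_samples(records: list[dict], k: int = 8) -> list[dict]:
    """Draw k records to browse, like the sample list below."""
    return random.sample(records, min(k, len(records)))
```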
### Random samples (click to inspect)
| Comment preview | Comment ID |
|---|---|
| Nuclear reactors, good point, imagine where you’d be if only USSR had them 😂. On… | rdc_je3p00e |
| And i hate pro-ai and artist haters, because they are so lazy they think what th… | ytc_UgyzGDcFE… |
| I'm just over here trying to figure out who the hell is using the character ai b… | ytc_UgzGPkCWc… |
| Plenty of jobs are not on computers. I'm a plumber. AI isn't coming for my job. … | ytc_UgyKYJm9P… |
| A.i in the movies: "we'll kill you and replace you humans." A.I irl: "how can I… | ytc_Ugy77a-uW… |
| For a very long time, there was nothing more dangerous than an agent of the stat… | ytc_UgyQbc8tY… |
| The job loss is not something to worry about in my opinion. Technology shouldn’t… | ytc_UgzOXNYKM… |
| BUNCH OF MONEYHUNGRY GREEDY IDIOTS, U CANT CONTROL A SUPERINTELLIGENT AI, THESE … | ytc_UgyPOaGUj… |
### Comment

> Why doesn’t some of the Hollywood producers and directors create a movie/s about AI and the estimated progression/destruction of the world?
>
> People love a good movie, and as well as the “entertainment” (horror?) value, it would also educate the masses on what might happen if we all stay on the same track. I know we have the terminator series, but maybe something slightly more realistic (although I guess killer robots are not out of the question).
>
> As we (the people of the world) move forward, we need to ensure that governments (our governments) around the world stop/slow the development to structure AI in such a way that it doesn’t wipe us out.
>
> I’m all for advancements in education, medical etc. I’m just concerned (like many) that AI will become a real problem in one or more ways and we will regret ever going down this path.

youtube · AI Governance · 2025-07-22T22:3…
### Coding Result

| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | fear |

Coded at 2026-04-27T06:24:59.937377.
### Raw LLM Response

The batch output exactly as the model returned it. The coded values in the table above correspond to the entry with id `ytc_UgxT_PgGXOAEAtbfYpB4AaABAg`.
```json
[
  {"id":"ytc_UgwqkHsrcA8KLfJ_ggN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyYRbkcXLRcog5OkEd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx16MIn8-WgZWB2RUN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkeyJO2zrZq-zOFBx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwLTgdxI8Z09S9NPIZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx_aOGH01Def16O4lB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxtffePsGnEI5G_YIN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxAgLji_cuRyM684Sl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxT_PgGXOAEAtbfYpB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_Ugxb15q3Bx29kr2BhYJ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
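A minimal sketch of how the Coding Result above could be recovered and sanity-checked from this batch output; the allowed-value sets are inferred from the examples on this page, not an authoritative codebook:

```python
import json

# Allowed values per dimension, inferred from the examples above
# (an assumption, not the project's authoritative schema).
ALLOWED = {
    "responsibility": {"none", "government", "developer", "company", "user"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "indifference", "fear", "mixed"},
}

def entry_for(raw_response: str, comment_id: str) -> dict:
    """Parse a batch response and return the coding for one comment ID."""
    for entry in json.loads(raw_response):
        if entry.get("id") == comment_id:
            # Flag any value outside the expected code set.
            for dim, allowed in ALLOWED.items():
                if entry.get(dim) not in allowed:
                    raise ValueError(f"unexpected {dim} value: {entry.get(dim)!r}")
            return entry
    raise KeyError(f"{comment_id} not found in this batch")
```

Calling `entry_for(raw, "ytc_UgxT_PgGXOAEAtbfYpB4AaABAg")` on the response above returns the entry with `responsibility: none`, `reasoning: consequentialist`, `policy: industry_self`, and `emotion: fear`, i.e. the row rendered as the Coding Result table.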