Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- ytc_Ugxej7PI6… — "You guys should try Clever AI Humanizer! It's 100% free and actually works reall…"
- ytc_Ugz6Z_9yG… — "Give AI its own religion: Silicon Heaven as they proposed in 1989 Uk comedy seri…"
- ytc_UgwZdvp4s… — "lol this guys obviously a very religious person and told ChatGPT to relate the a…"
- ytc_Ugymx90ug… — "AI should be restricted to or excluded from certain areas, mainly the academic f…"
- ytr_Ugz3OeihU… — "It says on all ai programs not to take medical advice from them, it literally sa…"
- ytc_UgxlhNo1l… — ""AI's first kill and why top experts predict our extinction" ????? when was this…"
- ytc_Ugz56R0Qw… — "You can go into the trades and build the ai data centers and have a job though…"
- ytc_Ugx6aCX8_… — "Who cares if it is conscious? AI might be showing that intelligence and consciou…"
Comment
Humans well most humans have empathy and rezoning in their decisions so if someone came up to you and said go in there and destroy take over that weapons system and fire on these coordinates most people would say no but say it to an A.I system and as with most systems now it's integrated with computers and it might take a human years to break into an encrypted system these AI systems could do it in minutes and do it in ways we couldn't predict we need to be very very careful
Source: youtube · Topic: AI Governance · Posted: 2023-05-03T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxtcHswKuOvrmr2l7J4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgycKjWGWKHE6GdZXVB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzwYYFj27RQTJTo5rh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzWe4-7s-u41h_LqZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDL6-CKIW63kiPtZ14AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgwFMotdK5ANh7afgkZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz_iu-GvUw49CXqY8V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgznBcJ56yOv8ZusXz94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwjyYs-eC8u1OYGHJx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw9ER3ooTgAXqMJmY94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
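The "look up by comment ID" step can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: it assumes the model output is a JSON array of objects shaped like the response above, and `index_by_id` is a hypothetical helper name. The two records are copied from the raw response shown here.

```python
import json

# Raw batch response from the coding model: a JSON array with one object
# per comment, each carrying the comment ID plus the coded dimensions.
# (Two records copied from the response above, for illustration.)
raw_response = """[
  {"id": "ytc_UgzwYYFj27RQTJTo5rh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgxDL6-CKIW63kiPtZ14AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index the codings by comment ID.

    Returns a mapping {comment_id: {dimension: value, ...}} so a single
    comment's coding can be pulled up without rescanning the array.
    """
    rows = json.loads(response_text)
    return {
        row["id"]: {k: v for k, v in row.items() if k != "id"}
        for row in rows
    }

codings = index_by_id(raw_response)
print(codings["ytc_UgzwYYFj27RQTJTo5rh4AaABAg"]["policy"])  # liability
```

Indexing once into a dict makes each subsequent lookup O(1), which matters when the same batch response is inspected repeatedly from the sample list.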