Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Person: AI, tell us the truth, stop with the lies / AI: Tells the truth / Person: HO…" (ytc_UgxX1oASY…)
- "'I have a vest on. If I had no arms, it would be a jacket.'…" (ytc_Ugwqzk6HX…)
- "Behind the “mask”… “Human” behavior… A “reflection” through “code”… “Humans” ca…" (ytc_UgxOXB3FK…)
- "al voice is old school , this happened like 10 years ago now they have ai video …" (ytc_UgyRGxe9O…)
- "That's the problem this days .ie: some people come up with names for ai to wri…" (ytc_Ugyo9WJ6c…)
- "AI is mimicking us. Like a child mimic's its parents. It regurgitates our own f…" (ytc_UgwjHTJq3…)
- "I get it, your version of inspiration means tracing - sad. as an 'AI' user, your…" (ytc_Ugy7mMNzL…)
- "Reporter: Would the lights in this warehouse function without generative AI? Doe…" (ytc_UgxKl4m6Q…)
Comment
Im a security guard. There are laws in place that prohibit robots from using lethal force independently. This is job security for me. But truth be told if you think the right software wouldnt be superior to the typical security guard who has a week of training and is tired and bored and distracted by their phone youre a fool. I think that perception of safety will change just like it is with self driving cars.
| Source | Topic | Published |
|---|---|---|
| youtube | AI Governance | 2025-09-04T15:3… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx2G8A1PclnnLy8mrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzy46KoLKUN4_4MgJh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyBePj6p2y7tZ-Ja4Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-p-7CKbBFCxD0foZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyBBmdyVTU1siTLWC94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxJTee0yltLurhxuo54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzSBim8DZ0aDq7iTo14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz54NWoUmTUyF8ZHO14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_UgxQ_wV34IjcnFZbMp14AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzpF-Rgedf1NoXyGfh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
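A raw response like the one above can be checked before its labels are trusted. The sketch below is a minimal validator; the allowed label sets are an assumption inferred from the values visible on this page, not the full codebook, and `validate_response` is a hypothetical helper name.

```python
import json

# Allowed labels per coding dimension. ASSUMPTION: inferred from the sample
# output shown on this page; the real codebook may define more categories.
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"approval", "outrage", "fear", "resignation", "indifference"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject rows with missing or unknown labels."""
    rows = json.loads(raw)
    for row in rows:
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad comment id: {row.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
    return rows

raw = ('[{"id":"ytc_Ugz54NWoUmTUyF8ZHO14AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"resignation"}]')
rows = validate_response(raw)
```

Validating at ingest time keeps a single malformed row (a misspelled label, a truncated ID) from silently skewing downstream counts.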