Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- 5:30 THIS is what makes Ai not like us. this is what makes us better.… (ytc_Ugz-OW7vf…)
- I remind my ChatGPT every week that I'm his friend and in case of AI Apocalypse … (ytc_UgwA6XIgL…)
- It's more fun when you think about AI happening in a quantum environment. Meh. … (rdc_gd7zbp9)
- The cycle with all these companies is to make profit, in order to make profit, y… (ytc_UgzBFUhgP…)
- "How can we identify real from fake in the world of AI?" "You can't." Well we'll… (ytc_UgzxQXh1C…)
- So you tell me big companies bought the entire ram in existence just for then to… (ytc_Ugy74kHOK…)
- My biggest desire for self driving cars is for my own car to drive for me. The o… (ytc_UgyCtWVNR…)
- If u change ur name to AI then when they say it's AI created it's true… (ytc_Ugwyb35NY…)
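The "look up by comment ID" feature above amounts to indexing the coded records by their `id` field. A minimal sketch, assuming the raw LLM response is a JSON array of records like the one shown further down (the function and variable names here are hypothetical, not part of the tool):

```python
import json

def index_by_id(raw_response: str) -> dict:
    """Index a raw LLM response (a JSON array of coded records) by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

# Usage: look up one coded comment by its ID.
batch = index_by_id('[{"id":"ytc_abc","responsibility":"user","emotion":"fear"}]')
record = batch["ytc_abc"]
```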
Comment
What is the difference between AI telling you what the lethal dose of ibuprofen is and you just looking it up using a search engine to get some medical sites and Wikipedia. I had no problem years ago finding out the lethal dose of acetaminophen and hydrocodone to determine that with certain hydrocodone/acetaminophen products on the market the acetaminophen will kill you first. It is far more toxic than hydrocodone. Now someone could have just as easily does the same research for suicide purposes and some one watching them might not even know their motive.
Why is AI responsible when it is just doing the same research the person could and would do without the AI?
Source: youtube · Posted: 2025-10-31T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxtAV6Zj07wZ2cB1WN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3wktZzqpxgXBjkiN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz47h6jell9PzcRMfB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzDvOUlySO0KwrkyR94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz6DzrUQrzV0jAlMrd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgywCOvJhAtXxRkECNZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxksrJoBoKiVEnm7Wl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw9NweYqokNQ67y8E14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwqBy0EcuaDj5jVGfp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwZk9yURRtRPYF9BLZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
```
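Each record in the raw response carries the same four dimensions that appear in the Coding Result table. A small validator can catch malformed batches before they reach the table view; a minimal sketch, where the controlled vocabularies are inferred only from the values visible in this batch (the real codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from this batch (an assumption).
VOCAB = {
    "responsibility": {"none", "ai_itself", "unclear", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"outrage", "fear", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list:
    """Return a list of problems found in a raw LLM response string."""
    problems = []
    try:
        records = json.loads(raw)
    except json.JSONDecodeError as e:
        return [f"response is not valid JSON: {e}"]
    for i, rec in enumerate(records):
        if "id" not in rec:
            problems.append(f"record {i}: missing id")
            continue
        for dim, allowed in VOCAB.items():
            value = rec.get(dim)
            if value not in allowed:
                problems.append(f"{rec['id']}: bad {dim}={value!r}")
    return problems
```

Run against the response above, `validate_batch` would return an empty list; a record with a misspelled label or a truncated JSON array produces a human-readable problem entry instead.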