Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples:

- "Humans wont be irrelevant. The thing about Ai, someone still needs to be smart …" (`ytc_Ugy_i22cK…`)
- "Buy a dog, once that thing learns how to talk you'll be wishing for a dog.…" (`ytc_Ugw64J-0Y…`)
- "The arts will not be replaced. Ai art is shit for a start- but people buy art be…" (`ytc_UgxbawC_e…`)
- "Imagine giving an AI program a job interview and it asks for pay and benefits. 🤡…" (`ytc_Ugy82Mb1K…`)
- "Someday in the future there'll be a robot taking out the trash and mowing its la…" (`ytc_UgzdwzAHs…`)
- "I think the AI being released now is a pilot program for what is coming in the f…" (`ytc_UgybwUKsX…`)
- "I'm sorry, but you've been mislead by a lot of purposefully inaccurate terminolo…" (`ytr_UgyjoMTP5…`)
- "Scientists and experts signed off on the dangers of fossil fuel emissions to the…" (`ytc_UgwAOPTyw…`)
Comment
A.I needs to be heavily regulated if we are going to have it in this world. The fact that any company can just use it as they please is scary. We’re already seeing how difficult some videos and pictures are to distinguish from reality, imagine the leaps it will take in 10 years. The fact that it’s connected to the internet is scary enough, that’s a lot of power you’re giving it, especially when it starts coding itself.
youtube · AI Responsibility · 2025-07-24T17:4… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwnRPMQeOZVsyj5MKZ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwET8cwGPcKdU7ujvx4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzZKpeUkD7YBUyblMJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyOcOncNNMc49ZoqgF4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugw26ZegEdfy2x8RZ9l4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgysvTF-nAecqezH3HJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugx2u0BiZF0ynZawEKZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwGuxztyZs5QSk3J5d4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxtcSssAsDrP86Jr4B4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxRshYyZ9YXxZRorcF4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"}
]
```
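The raw response above is one JSON array per coded batch, with each element carrying the comment ID plus the four coded dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch can be indexed for lookup by comment ID (the parsing helper is illustrative, not part of the tool itself; the field names are taken from the response above):

```python
import json

# One entry from the batch response shown above, verbatim.
raw_response = """
[
  {"id": "ytc_UgwET8cwGPcKdU7ujvx4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

# Index the batch by comment ID so any coded comment can be
# looked up directly, as the inspection view does.
coded = {row["id"]: row for row in json.loads(raw_response)}

record = coded["ytc_UgwET8cwGPcKdU7ujvx4AaABAg"]
print(record["policy"])   # regulate
print(record["emotion"])  # fear
```

Note that this record matches the Coding Result table for the example comment above: the dimensions in the table are simply the fields of the matching JSON object.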