Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (truncated previews, with comment IDs in parentheses):

- "What's going to happen is the role truck drivers fill will change in the areas a…" (ytc_UgxQU0wA7…)
- "But why??? Why don't we just limit these tools to things like medicine, healthc…" (ytc_UgxN6n5AF…)
- "The good news is we don't have to worry about climate change, since we're all go…" (ytc_UgwSseY9h…)
- "Anything that can automatically drive you can automatically kill you. Imagine a …" (ytc_UgwmNiShg…)
- "I think it would be more ethical if the artists it takes inspiration from were p…" (ytc_UgwimIhQ3…)
- "Asimov gave humanity the hard rules to be programmed into AI. Humanity did not d…" (ytc_UgzVb9zz4…)
- "Pretty scummy admin. I wonder how they would feel if they had AI images of them …" (rdc_nvr6g12)
- "This is truly the main issue when it comes to regulating AI. Even IF we can get …" (rdc_k0aqnjd)
Comment

> I feel like AI can be fun for personal use. I can install ChatGPT To run on my computer and depending on how much power I give it, the quality increases. So I can make something like a little chat bot. Or code bullet that made a really fun series using AI(just for fun) and actually did a lot of work to make it look presentable. For now, AI is only good for that. I never needed it in any case other then doing my homework(which, let's be honest, isn't a good thing). I really hope we stop using it the way we do right now until we are actually "worthy" and ready for that technology

Platform: youtube · Video: Viral AI Reaction · Posted: 2025-04-05T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgznLpzWnPH9tYHQ3Kx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKvNGDeK6Mca2BHcl4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzdf6aOwS1aVugCOdl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxaQT8Hv2rBiV-EeFl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxEPlI1TiAHbOPMt3l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzFYjskcfWqfFTglpR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyXBylTCCTN9QfOWtB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwX1VHzDndzuVucBMR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzTrgqkZdFGfQvP9cl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxPTbIA8F6Ubq_2MBh4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
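The raw response above is a JSON array of per-comment records, one object per coded comment, with the same four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and validated is below; the allowed label sets are inferred only from the values visible in this dump (the real codebook may include more labels), and `parse_coding_response` is a hypothetical helper name, not part of any actual pipeline.

```python
import json

# Allowed labels per coding dimension, inferred from the samples above
# (assumption: the actual codebook may define additional labels).
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "indifference", "outrage", "resignation", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, dropping rows
    whose labels fall outside the codebook."""
    coded = {}
    for row in json.loads(raw):
        codes = {dim: row.get(dim) for dim in CODEBOOK}
        if all(codes[dim] in CODEBOOK[dim] for dim in CODEBOOK):
            coded[row["id"]] = codes
    return coded

# Usage with a one-record response in the same shape as the dump above:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"virtue","policy":"none","emotion":"approval"}]')
result = parse_coding_response(raw)
print(result["ytc_example"]["emotion"])  # approval
```

Validating against an explicit label set is what lets the tool flag a malformed model output (a misspelled label, a missing dimension) instead of silently storing it.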