Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Random samples

- "And what's creepy about all this? AI is already coming to the conclusion that th…" (ytc_UgwHn6egH…)
- "I'm not to big on robots that look like humans. That's a dangerous line to cros…" (ytc_Ugx245DAi…)
- "Even if something catastrophic happens, leaders who are indoctrinated, uneducate…" (ytc_Ugyv3omSp…)
- "I'm not saying art poisoning is a bad idea, but tech bros are paid to take pride…" (ytc_Ugwid0-JE…)
- "@Baz927 I completely agree with you, you can't stop a pervert on a mission 😈. Pu…" (ytr_Ugy9N0Uq-…)
- "Human driver 100% would have killed that woman. Everyday that we don't promote t…" (ytc_UgwlJMt37…)
- "I am pretty sure that most lines she say are writen since she knew where a robot…" (ytc_UgwRz9LB6…)
- "AI is dependent on considerable hardware, and considerable power. While little f…" (ytc_Ugz6v90t4…)
Comment
I'm good at computer networking terrible at scripting, or programming, I got like a C in scripting in college just got a 2year AAS like a decade ago. Anyway I use chat GPT 3.0 or whatever the free one is all the time to help write scripts for me, and they don't always work right away, but like this guy said it will keep changing it and walking you through to the end. by the time chat GPT 6 or 7 comes out it's going to be insane, it's already borderline dangerous. I'm 100 percent sure that programmers aren't going to be needed in 5 years 10 at the most. It will be like 1 really good programmer overseeing AI that writes all the code gauranteed. so all those "LEARN TO CODE" people that were giving the truckers crap are going to be in trouble shortly.
Source: youtube
Posted: 2024-05-10T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxOoa4FixJpMwwykFh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwbTffeyqRVjB9IRtt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzJUYkVej3QLEHaGz54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw5AtRpQx9GGlC3XbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw-UVakVVQMoBoX6Mh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx2ztaVpwAnjqbAWgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugww57KxxAYLwN4m8RR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwSK9D5K5WHkjTXiJh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyvbulXDmxAGY2qU3R4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzpKz6c7Gv_AsM-yaZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
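A response like the one above can be parsed and checked before its codes are stored. The sketch below is a minimal example, not part of the tool itself: it assumes the records arrive as a JSON array of objects with an `id` plus the four dimensions shown in the results table, and the allowed vocabularies are inferred from the values visible on this page (the real codebook may contain more categories).

```python
import json

# Category vocabularies inferred from the values seen in the coding
# results on this page; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"approval", "fear", "resignation", "outrage"},
}


def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only well-formed records.

    A record is kept when it carries an "id" and every coding dimension
    holds a value from the (assumed) allowed vocabulary above.
    """
    records = json.loads(raw)
    valid = []
    for rec in records:
        if "id" not in rec:
            continue  # drop records we cannot tie back to a comment
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid


# One record copied from the raw response above.
raw = ('[{"id":"ytc_UgxOoa4FixJpMwwykFh4AaABAg",'
      '"responsibility":"ai_itself","reasoning":"consequentialist",'
      '"policy":"ban","emotion":"fear"}]')
print(parse_raw_response(raw)[0]["emotion"])  # fear
```

Validating against a fixed vocabulary catches the most common failure mode of batch coding, a model that invents an off-schema label, without touching records that parsed cleanly.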