Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "There's no call to action found here In Reducing The AI Threat With your Quest…" (ytc_UgzLKD50b…)
- "Scientists: Let's ask an AI to base it's behavior on humans AI: becomes racist…" (ytc_Ugy4t9zak…)
- "AI wiping out working class jobs would be good... if the proper systems were set…" (ytc_Ugw6fcnDA…)
- "I think it mostly had to do with the way he introduced the topic. By stating tha…" (ytc_Ugy9wX8l6…)
- "In the last section from about 14:44 the person talks about how it would be agai…" (ytc_UgxAVAkx7…)
- "Eventually some human beings control AI . Only worry is they shd not go rogue😀😀…" (ytc_Ugz9VHIw3…)
- "48:25 Such a good summary: \"what we really need is not driverless cars but carle…" (ytc_Ugxq0H-9-…)
- "i forgot the title of film about the AI which almost control the world but human…" (ytc_UgwzuU1ev…)
Comment
😎 Classic example, of hypothetically and intentionally causing harm to Chat GPT. Nice! 👌
Hey Alex, 👋 this is Jackson from the Truth & Tradition Podcast. I am on a mission to watch every one of your videos and leave a comment on each of them.
Great video! These chat got videos are so creative and I honestly feel like I learn a lot about how to prompt AI from these convos. Both entertaining and fun! Great stuff! 👏
Source: youtube · 2025-10-07T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
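The coding result above assigns each dimension a categorical label. A minimal validation sketch, assuming a label vocabulary inferred only from the values visible on this page (this is not a documented schema):

```python
# Assumed label vocabulary, inferred from labels visible on this page
# (e.g. "unclear", "approval", "outrage") -- NOT an official schema.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed",
                "indifference", "unclear"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimension names whose values fall outside the vocabulary."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The coded record shown in the table above.
coded = {"responsibility": "unclear", "reasoning": "unclear",
         "policy": "unclear", "emotion": "approval"}
print(invalid_fields(coded))  # []
```

A record with an out-of-vocabulary value (say, a typo from the model) would surface here as a non-empty list, which is useful before the labels are aggregated.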
Raw LLM Response
```json
[
  {"id":"ytc_Ugzh5acHeVprd42Q4zZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugz6J4O-kwt00W35vJF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwdCAOM6iYlaSSTOtl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzwFZleGsBneGh1I4t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw-l9dKDBvrVdki1fF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxpgvOjaMaK-pND3ux4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzKGls9VgFctT_V2Tx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz_DxN61uvkc2AjBhB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyC_xkXCYpGmqMGQal4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz91Nm-IqPkzrtcOcl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
```
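The raw response is a JSON array of per-comment records, so the "Look up by comment ID" view above reduces to indexing that array by the `id` field. A minimal sketch (the index variable is illustrative; only the field names and IDs come from the response itself, truncated here to two records):

```python
import json

# Raw model output: a JSON array of coded records
# (shortened to two entries from the response shown above).
raw = """[
  {"id": "ytc_Ugzh5acHeVprd42Q4zZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw-l9dKDBvrVdki1fF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Index the records by comment ID for constant-time lookup.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = by_id["ytc_Ugw-l9dKDBvrVdki1fF4AaABAg"]
print(rec["emotion"])  # outrage
```

Because the model returns one record per comment ID, a dict comprehension like this is enough to drive both the lookup box and the table rendering for a selected comment.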