Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I’ve never even heard from the school and I live right outside of Boston like a …" (ytc_UgxhuTbnw…)
- "AI reminds me when in the 80’s they told us that by the year 2000 they will be f…" (ytc_UgxSK5af-…)
- "AI’s pattern recognition is not the thing that needs fixing. 20+ years of FBI cr…" (ytc_UgwTgSNgA…)
- "Eric Schmidt argument that ai that destroys humanity is bad for capitalism is so…" (ytc_UgxOhLihi…)
- "You need an AI that can think like a senior engineer (planning architecture), re…" (ytc_UgyqrrjqW…)
- "So far self driving cars have killed a pedestrian. Also, a user was able to find…" (ytr_UgzyoQHfk…)
- "The best part of their hiring AI is that it sets up an interview for you and whe…" (rdc_n0m2w41)
- "\"She thinks her art is safe\" >:( What kind of cartoon villain ahh sentence is th…" (ytc_UgzUxiHS6…)
Comment
For once you covered a subject that I am well versed in. The fear of AI like ChatGPT is all overblown. It is just gathering data from the internet and giving a response based on what other humans of said. It’s fun to speculate and fantasize about such things. It giving us responses that it thinks is right and want to hear.
The AI you should be worried about is the AI no one knows about. The AI’s that are being developed in secret…
ChatGPT is just a fun toy. A cheap party trick
youtube · AI Governance · 2023-07-10T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxCyGX2o_4MDn41KQh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwYPZkHnSmV4h6awKR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxXqhZaqOW8d80Lf9B4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx5Wu7n6v9DMXA-dbt4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgxJ2SePE-8zKJm2C8d4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugy-5SuasVaj5CUEj5F4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyT4JkDdBEuafZkBxh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyF3G6kPgKjEU4OQV14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxcgZDoex5G6uF6v4N4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugx7cK4qZof-YY8wy594AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"}
]
```
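The "Look up by comment ID" view above implies that each raw response array gets keyed by its `id` field. A minimal sketch of that indexing step, assuming the four coding dimensions shown in the result table; the `RAW` string and `index_by_id` helper are illustrative, not part of the tool itself:

```python
import json

# Hypothetical raw LLM response, in the same shape as the array shown above.
RAW = (
    '[{"id":"ytc_A","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"approval"},'
    '{"id":"ytc_B","responsibility":"developer","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"}]'
)

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coded record by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        # Keep only records that carry every coding dimension.
        if all(dim in rec for dim in DIMENSIONS):
            coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(RAW)
print(coded["ytc_B"]["emotion"])  # → outrage
```

Keying by ID makes the lookup above an O(1) dictionary access, and the dimension check quietly skips any malformed entry the model might emit.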