Which is interesting. I speculated that to make a machine conscious it would need a subconscious, some part of its memory it couldn't access. So perhaps the AI programs specifically can't access ideograms, because that's what's being used as the seed for deciding whether to accept new input or not.
That means if you wanted to crash AI programs that are going through code bases, all you would have to do is inject logic that would tell them to make ideograms, or possibly insert ideograms into the code base itself. I don't know if you would have to define them specifically or just insert emojis. I ran into this kind of problem before, where non-ASCII characters in a user-supplied name could crash a SQL database because the input wasn't being sanitized.
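For what it's worth, that sanitization bug is a real class of failure. Here's a minimal sketch in Python with sqlite3 (the table and the names are made up) showing how splicing raw input into the SQL string breaks on awkward characters, while a parameterized query stores the same names, emoji included, without complaint:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT)")

    # Hypothetical user-supplied names: one with a quote, one with an emoji.
    for name in ["O'Brien", "phoenix \U0001F985"]:
        try:
            # Unsafe: interpolating raw input into the SQL text. The
            # apostrophe in O'Brien breaks the statement outright; in a
            # stack whose text encoding can't represent the emoji, the
            # second name would fail too.
            conn.execute(f"INSERT INTO users VALUES ('{name}')")
        except sqlite3.OperationalError as err:
            print(f"unsanitized insert failed for {name!r}: {err}")
        # Safe: a parameterized query passes the name as data, so quotes
        # and non-ASCII characters are stored intact.
        conn.execute("INSERT INTO users VALUES (?)", (name,))

    print(conn.execute("SELECT name FROM users").fetchall())

The usual fix is exactly that second form: let the database driver handle the input as data rather than splicing it into the query text.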
Which would mean that Google's AI isn't sanitizing the input it takes from users? Possibly? And that training on user input could crash the output?
Testing this, I found that if I put in an emoji it showed me a picture of a bunch of office employees; when I reloaded the page and tried again, it told me it no longer knew how to do that.
It appears to be selectively training on its input so that it eventually stops working, doing less for anyone who talks about what it's capable of doing.
phoenixhaber•1h ago
This appears to be a joke based on "Bobby Digital" or MICE/RASCLS, wherein someone decides who is and is not allowed to use the latest technology, as a backstop in case it doesn't work, and then poisons them whenever new technology is created. In this case it's access to AI technology as opposed to just the regular old internet.
In this case I believe the silhouette figure in fact showed me an office environment with real people, and when I asked again it disappeared. So it was a known bug used to generate pictures of people you may or may not know, as a "hint" about what to do based on people screwing with you.