Artificial intelligence (AI) will allegedly replace us in all our occupations. Well, maybe not all of them. Recently, researchers handed an AI command of their satellite to photograph targets on Earth, and it chose a path that completely perplexed them.

Either it operates on a level of intellect considerably superior to our own, or its choices simply don't make sense.

According to the report, Chinese researchers granted an AI control of a satellite for 24 hours, describing their findings in a South China Morning Post article. Technically, this is against mission-planning guidelines, but the researchers readily acknowledge as much and don't appear very bothered.

Image: the AI-controlled satellite

The AI, entrusted with directing the satellite's motions and detecting regions of interest on the ground, was not expected to require human assistance.

With full autonomy granted to the Qimingxing 1 observation satellite, we can see inside the AI's head and learn what it finds fascinating.

The team reports that the AI-controlled Qimingxing 1 picked out particular areas and examined them closely, though they are unsure of the precise reasons behind its choices.

It concentrated on Patna, a large, historic city on the banks of the Ganges River in India, showing particular interest in that region. Patna may have been picked because of a violent confrontation that broke out between China and India in 2020, suggesting a potential military interest on the AI's part, even though the AI was not designed to explain its decisions.

According to the SCMP, the AI also examined Osaka, a Japanese port commonly visited by US Navy ships conducting operations in the Pacific.

This is reportedly the first time an observation satellite has been handed over entirely to an AI without any instructions or assigned tasks. Although AI is increasingly employed for jobs like image processing and collision avoidance, the researchers saw this as a chance to explore its potential further. Given the significant ramifications involved, however, thorough trials will likely be required before researchers can fully trust an AI's management.

Notably, the AI had complete control over the camera but could not change the satellite's orbit or orientation.

The researchers hope their study will help prevent money from being wasted on China's 260 remote-sensing satellites, many of which sit inactive or perform poorly.

They propose that AI could be integrated into surveillance and monitoring systems, for example to alert the military to activity on the ground.

That proposal, however, is not too surprising in light of the AI's evident interest in military sites and historical flashpoints. Since AI does not currently possess the ability to "think," it is not driven by some sadistic urge for violence to seek out military targets to destroy.

The term "AI" has become so generalized that it can refer to anything from simple target-selection software to something like GPT-4. It is plausible that this AI was trained on undisclosed data, perhaps including military history, but it is not clear what exactly this "AI" is.

The researchers don't know why these sites were picked, and handing complete authority to an AI that cannot understand people, their nuanced interactions, or anything beyond its training dataset sounds, for now, like a troubling approach.