
The Verge | Computing

China Begins Assembling Its Supercomputer in Space

“China has launched the first 12 satellites of a planned 2,800-strong orbital supercomputer satellite network, reports Space News. The satellites, created by the company ADA Space, Zhijiang Laboratory, and Neijang High-Tech Zone, will be able to process the data they collect themselves, rather than relying on terrestrial stations to do it for them…

Each of the 12 satellites has an onboard eight-billion-parameter AI model and is capable of 744 tera operations per second (TOPS) — a measure of their AI processing grunt — and, collectively, ADA Space says they can manage five peta operations per second, or POPS. That’s quite a bit more than, say, the 40 TOPS required for a Microsoft Copilot PC. The eventual goal is to have a network of thousands of satellites that achieve 1,000 POPS, according to the Chinese government.

The satellites communicate with each other at up to 100 Gbps using lasers, and share 30 terabytes of storage between them.”

From The Verge.
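
For scale, those figures convert directly (1 POPS = 1,000 TOPS), and a little arithmetic puts the quoted numbers side by side. A back-of-the-envelope sketch in Python, using only values from the excerpt above:

    # Back-of-the-envelope arithmetic from the figures quoted above.
    TOPS_PER_POPS = 1_000        # 1 peta-op/s = 1,000 tera-op/s

    per_satellite_tops = 744     # each satellite's stated throughput
    satellites = 12              # launched so far
    claimed_cluster_pops = 5     # ADA Space's stated aggregate
    copilot_pc_tops = 40         # Microsoft Copilot PC requirement
    network_goal_pops = 1_000    # stated target for the full network

    # A naive sum of the per-satellite figures comes to ~8.9 POPS;
    # note the claimed aggregate of 5 POPS is lower than that sum.
    naive_pops = per_satellite_tops * satellites / TOPS_PER_POPS
    print(f"naive 12-satellite sum: {naive_pops:.1f} POPS")
    print(f"one satellite vs. a Copilot PC: {per_satellite_tops / copilot_pc_tops:.1f}x")
    print(f"goal vs. claimed aggregate: {network_goal_pops / claimed_cluster_pops:.0f}x to go")

    # Moving the shared 30 TB store over one 100 Gbps laser link:
    transfer_s = 30e12 * 8 / 100e9
    print(f"30 TB at 100 Gbps: {transfer_s / 60:.0f} minutes")

Note that the 12 per-satellite figures sum to roughly 8.9 POPS, above the 5 POPS aggregate the excerpt attributes to ADA Space; the excerpt doesn’t reconcile the two.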

Our World in Data | Computing

The Length of Software Tasks AI Can Do Is Increasing Quickly

“Before 2023, even the best AI systems could only perform tasks that take people around 10 seconds, such as selecting the right file.

Today, the best AIs can fairly reliably (with an 80% success rate) do tasks that take people 20 minutes or more, such as finding and fixing bugs in code or configuring common software packages…

If developments keep pace for the next few years, we could see systems capable of performing tasks that take people days or even longer.”

From Our World in Data.
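
That trend compounds quickly. A minimal extrapolation sketch: the 20-minute, 80%-success starting point comes from the excerpt, while the roughly seven-month doubling time is an assumption drawn from METR’s published estimate for this metric:

    # Illustrative extrapolation of the task-length trend quoted above.
    # ASSUMPTION: task length doubles every ~7 months (METR's estimate);
    # the excerpt itself only says "if developments keep pace".
    DOUBLING_MONTHS = 7
    start_minutes = 20           # today's tasks at an 80% success rate

    for months_ahead in (12, 24, 36):
        minutes = start_minutes * 2 ** (months_ahead / DOUBLING_MONTHS)
        print(f"+{months_ahead} months: ~{minutes / 60:.1f} person-hours per task")

On these assumptions, three years out lands near twelve person-hours per task, i.e. the working-day scale the excerpt anticipates.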

Epoch AI | Computing

Amazing Trends in AI Supercomputers

“We curated a dataset of over 500 AI supercomputers (sometimes called GPU clusters or AI data centers) from 2019 to 2025 and analyzed key trends in performance, power needs, hardware cost, and ownership. We found:

  • Computational performance grew 2.5x/year, driven by using more and better chips in the leading AI supercomputers.
  • Power requirements and hardware costs doubled every year. If current trends continue, the largest AI supercomputer in 2030 would cost hundreds of billions of dollars and require 9 GW of power.
  • The rapid growth in AI supercomputers coincided with a shift to private ownership. In our dataset, industry owned about 40% of computing power in 2019, but by 2025, this rose to 80%.
  • The United States dominates AI supercomputers globally, owning about 75% of total computing power in our dataset. China is in second place at 15%.”

From Epoch AI.
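
Those yearly doublings compound fast. A minimal sketch of the 2030 projection: the growth rates are Epoch’s, but the baseline power and cost values below are illustrative placeholders, not figures from their dataset:

    # Compound the quoted growth rates over five years (2025 -> 2030).
    # ASSUMPTION: the baseline values are illustrative stand-ins for a
    # leading 2025 system, not numbers taken from Epoch's dataset.
    PERF_GROWTH = 2.5            # performance, x per year
    POWER_GROWTH = 2.0           # power draw, x per year
    COST_GROWTH = 2.0            # hardware cost, x per year

    base_power_gw = 0.3          # placeholder: leading 2025 system
    base_cost_bn = 10.0          # placeholder: leading 2025 system, $B
    years = 5

    print(f"performance: {PERF_GROWTH ** years:.0f}x the 2025 leader")
    print(f"power:       {base_power_gw * POWER_GROWTH ** years:.1f} GW")
    print(f"cost:        ${base_cost_bn * COST_GROWTH ** years:.0f}B")

Five doublings are a 32x multiplier, so a ~0.3 GW, ~$10B system in 2025 lands near 9.6 GW and $320B by 2030, consistent with the 9 GW and hundreds-of-billions figures Epoch cites.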

TechCrunch | Computing

Google Unveils a Next-Gen Family of AI Reasoning Models

“Google has experimented with AI reasoning models before, previously releasing a ‘thinking’ version of Gemini in December. But Gemini 2.5 represents the company’s most serious attempt yet at besting OpenAI’s ‘o’ series of models.

Google claims that Gemini 2.5 Pro outperforms its previous frontier AI models, and some of the leading competing AI models, on several benchmarks. Specifically, Google says it designed Gemini 2.5 to excel at creating visually compelling web apps and agentic coding applications.

On an evaluation measuring code editing, called Aider Polyglot, Google says Gemini 2.5 Pro scores 68.6%, outperforming top AI models from OpenAI, Anthropic, and Chinese AI lab DeepSeek.

However, on another test measuring software dev abilities, SWE-bench Verified, Gemini 2.5 Pro scores 63.8%, outperforming OpenAI’s o3-mini and DeepSeek’s R1, but underperforming Anthropic’s Claude 3.7 Sonnet, which scored 70.3%.

On Humanity’s Last Exam, a multimodal test consisting of thousands of crowdsourced questions relating to mathematics, humanities, and the natural sciences, Google says Gemini 2.5 Pro scores 18.8%, performing better than most rival flagship models.”

From TechCrunch.
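
For easy comparison, here are the scores the excerpt actually gives, gathered in one place; competitors whose numbers the article omits are left out rather than filled in:

    # Benchmark scores quoted in the TechCrunch excerpt above; no
    # figures beyond those the article states are included.
    scores = {
        "Aider Polyglot":       {"Gemini 2.5 Pro": 68.6},
        "SWE-bench Verified":   {"Gemini 2.5 Pro": 63.8, "Claude 3.7 Sonnet": 70.3},
        "Humanity's Last Exam": {"Gemini 2.5 Pro": 18.8},
    }

    for bench, results in scores.items():
        print(f"{bench:22s} " + ", ".join(f"{m}: {s}%" for m, s in results.items()))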