“Microsoft is training a new, in-house AI language model large enough to compete with those from Alphabet’s Google and OpenAI, the Information reported on Monday.
The new model, internally referred to as MAI-1, is being overseen by recently hired Mustafa Suleyman, the Google DeepMind co-founder and former CEO of AI startup Inflection, the report said, citing two Microsoft employees with knowledge of the effort…
MAI-1 will have roughly 500 billion parameters, the report said, while OpenAI’s GPT-4 is reported to have one trillion parameters and Phi-3 mini measures 3.8 billion parameters.”
From Reuters.
“Theoretical physicists at Utrecht University, together with experimental physicists at Sogang University in South Korea, have succeeded in building an artificial synapse. This synapse works with water and salt and provides the first evidence that a system using the same medium as our brains can process complex information.”
From Phys.org.
“Scientists at Intel have built the world’s largest neuromorphic computer, or one designed and structured to mimic the human brain. The company hopes it will support future artificial intelligence (AI) research.
The machine, dubbed ‘Hala Point,’ can perform AI workloads 50 times faster and use 100 times less energy than conventional computing systems that use central processing units (CPUs) and graphics processing units (GPUs), Intel representatives said in a statement. These figures are based on findings uploaded March 18 to the preprint server IEEE Xplore, which have not been peer-reviewed.”
From Live Science.
“Researchers in Japan have developed a new method for making 5-cm (2-in) wafers of diamond that could be used for quantum memory. The ultra-high purity of the diamond allows it to store a staggering amount of data – the equivalent of one billion Blu-ray discs.”
From New Atlas.