Photonics is a tough nut to crack.

The growing computation

The power necessary to train sophisticated AI models like OpenAI’s ChatGPT may eventually hit a wall with mainstream chip technologies.

A 2019 analysis found that the amount of energy used to train AI models doubled every two years from 1959 to 2012, and that energy use increased sevenfold after 2012.
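To put that claim in perspective, here is a minimal sketch (my own illustration, not part of the cited analysis) of how a fixed doubling period compounds over the 1959 to 2012 window mentioned above.

```python
# Illustrative only: how a fixed doubling period compounds over a long window.
# The two-year doubling period and the 1959-2012 window come from the figures
# cited above; the arithmetic just shows what that claim implies.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Total multiplicative growth after `years` given a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

factor = growth_factor(years=2012 - 1959, doubling_period_years=2.0)
print(f"Doubling every two years over 1959-2012 implies roughly {factor:,.0f}x growth")
# 2 ** (53 / 2) is on the order of 95 million, which is why any acceleration
# after 2012 compounds into dramatically higher training energy and cost.
```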

Mainstream hardware is already under pressure. Microsoft is reportedly experiencing an internal shortage of the server hardware needed to run AI, and the shortage is driving up prices. CNBC, speaking to analysts and technologists, estimated the cost of training a ChatGPT-like model from scratch at more than $4 million.

One proposed solution to the AI training dilemma is photonic chips, which use light to send signals instead of the electricity used by conventional processors. Photonic chips could theoretically deliver better training performance because light generates less heat than electricity, travels faster and is less susceptible to changes in temperature and electromagnetic fields.

Lightmatter, LightOn, Luminous Computing, Intel and NTT are among the companies developing photonic technologies. But while the technology generated a lot of excitement a few years ago, and attracted a lot of investment, the sector has since cooled noticeably.

There are various reasons why, but the overall message from investors and analysts who study photonics is that photonic chips for AI, while promising, are not the panacea they were once believed to be.

