On Thursday, Alibaba announced that it is developing its own AI chip, joining a wave of technology companies doing the same. Alphabet (Google's parent company), Facebook, Apple and other giants have previously said they are developing their own AI chips.

AI chips: you have one, I have one, everyone has one

On April 19, Alibaba said that its DAMO Academy is developing a neural network chip called Ali-NPU. The chip will be used for AI inference tasks such as image analysis and machine learning. By design, its price-performance ratio will be 40 times that of comparable products. Alibaba did not reveal a specific development timeline or launch date, but stressed that the chip will help it put AI to use in business scenarios, improve operational efficiency and reduce costs.

Last year, Alibaba hired Liang Han, a former Qualcomm employee, as an AI chip architect. Job postings show that Alibaba is seeking to expand its headcount in the Silicon Valley area.

Google has made similar moves. Since 2015, Alphabet's internal engineers have been using Google's custom tensor processing units (TPUs) to speed up machine learning tasks. Last year Google released the second-generation TPU, which can be used to train AI models and handle more demanding computations, rivaling NVIDIA's graphics processing units. In February, Google opened the second-generation TPU to the public through its cloud computing service.

In addition, Apple has built a neural engine into the iPhone X's chip; Microsoft is developing an AI chip for its HoloLens mixed reality headset; Tesla has also been developing AI chips for its cars...

By betting on in-house chip research and development, the technology giants can, on one hand, make their artificial intelligence applications run better and cost less; on the other hand, they can reduce their dependence on suppliers such as NVIDIA.

Will NVIDIA's business be affected?
The trend of technology giants developing their own AI chips may eventually reshape the relationship between these traditional buyers and their suppliers, and reduced reliance on chip vendors will have at least some impact on NVIDIA's business. For now, however, NVIDIA's advantage in chips remains clear. The chip projects at Alibaba and Google are still at a relatively early stage and are difficult to compare with NVIDIA's data center GPU business.

In fact, Google and NVIDIA remain partners: Google still runs NVIDIA GPUs alongside its TPUs. Alibaba likewise still offers GPUs through its cloud computing service, and will continue to do so after the Ali-NPU launches.

CNBC quoted Matthew Ramsay and Vinod Srinivasaraghavan, analysts at the Canadian investment bank Canaccord Genuity, as saying that NVIDIA's latest GPU releases had strengthened their confidence in the company, and that NVIDIA would successfully defend its pricing levels even as data center sales scale up and ASIC products increase in both internal and commercial applications.

Source: Wall Street News. Responsible editor: Hou Wei Cheng