SK hynix Inc. is not sure yet whether to apply for U.S. government grants under the Chips Act, the chipmaker’s top executive said Wednesday, calling the process too complicated.
While the application is under serious consideration, “we found the process too demanding,” Park Jung-ho, vice chair and co-CEO of SK hynix, said at an annual shareholders meeting in Icheon, 52 kilometers southeast of Seoul, where the chipmaker’s headquarters is located.
Earlier this week, the U.S. Commerce Department released new requirements for chipmakers seeking federal grants for building chip factories in the U.S.
Under the new guidelines, chipmakers must submit detailed revenue and profit projections through an Excel-like tool and communicate with the U.S. Commerce Department before completing a full application.
Business information requested by Washington, such as the estimated number of wafers to be sold at a U.S. factory, is seen by many industry watchers as too sensitive.
SK hynix, the world's second-biggest memory chip maker after its local rival Samsung Electronics Co., plans to begin building a semiconductor packaging plant in the U.S. in the first half of this year. The company has said it is considering applying for the federal grants.
Regardless of the federal money, the construction will proceed as planned, Park said.
“To be fair, we will build a semiconductor packaging, not manufacturing, plant, so we will be under less pressure” to submit sensitive chip production data than other chipmakers might feel, he said.
The CEO also said next-generation DRAM products with faster speeds and larger capacities are key to spurring growth this year, as the race over products and services based on artificial intelligence (AI) intensifies.
“As demand for AI products and chatbots grows, DDR5 will be the core of our product portfolio,” he said.
DDR5 is a next-generation DRAM standard that offers higher speed and density with lower power consumption than its predecessor, DDR4. It is optimized for data-intensive applications, such as big data, AI and machine learning.
SK hynix supplies high-bandwidth memory (HBM) chips to Nvidia Corp., whose graphics processing unit is used in OpenAI’s widely popular ChatGPT, an artificial intelligence chatbot launched in November 2022.
“In the server market, we see growing adoption of highly efficient memory chips with high capacity,” he said, particularly since the launch of Intel's latest server chip, Sapphire Rapids.
“In fact, SK hynix’s HBM products play a very important role in the operation of ChatGPT,” he said, adding, “Global AI chip companies come to us to buy our products, which shows our leading position in the field.”
As macroeconomic uncertainty persists, Park reiterated that the chipmaker will reduce its investment by slightly more than half this year from last year’s 19 trillion won (US$14.5 billion), citing a precipitous fall in demand.
“We are reviewing all spending from square one,” he said, adding he expects the output cuts, which began last year to ease a supply glut and falling prices, to take effect soon.
Source: Yonhap News Agency