Intel's quarterly results underscore challenge in mobile

SAN FRANCISCO - Intel's mobile and communications group took in a scant $1 million in revenue in the third quarter, underscoring the challenge the top chipmaker faces expanding into smartphones and tablets.

The Santa Clara, California-based company on Tuesday reported third-quarter results that beat Wall Street's expectations, helped by a recovery in personal computers, by far its largest market.

Progress in Intel's smartphone and tablet strategy was less clear. The mobile and communications group had an operating loss of $1.04 billion for the September quarter, deeper than the $810 million loss a year earlier.

The company in recent years was slow to recognize the significance of the smartphone revolution, and CEO Brian Krzanich, who took over in 2013, has accelerated efforts to catch up by subsidizing costs for tablet manufacturers that use its chips.

Smartphones and tablets account for a tiny fraction of Intel's business, but the company sees staking out territory in mobile as key to emerging markets such as wearable computing products.

Intel is using its deep pockets to help it reach its goal to see its chips used in at least 40 million tablets this year, up from 10 million in 2013.

The vast majority of smartphones and tablets are made with processors based on rival technology from Britain's ARM Holdings Plc.

Intel's subsidies are meant to reduce the burden to manufacturers of designing tablets with Intel's current chips, which require more expensive memory and other components that it says drive up costs.

Intel says its future chips will be more cost-effective for manufacturers and will not need subsidies in order to sell.
“In summer, stock up on furs; in winter, on gauze; in drought, on boats; in flood, on carts.” - Fan Li (范蠡)
China is indeed a tough place for US companies, but it is also a market that can't be ignored...

Intel to invest $1.6 billion in China factory

BEIJING - Intel Corp will invest $1.6 billion to upgrade its factory in the city of Chengdu in western China, the latest sign of how the chipmaker is deepening ties in a market that is proving increasingly troublesome for some U.S. technology peers.

As part of the upgrade, Intel said in a statement on Thursday it would bring its most advanced chip-testing technology to China. In exchange it will receive local and regional government support for construction.

"Deploying our newest advanced testing technology in China shows our commitment to innovating jointly with China," Intel executive vice president William Holt said in the statement. "The fully upgraded Chengdu plant will help the Chinese semiconductor industry and boost regional economic growth."

The announcement comes three months after Intel purchased a minority stake in a government-controlled semiconductor company to jointly design and distribute mobile chips, an industry that China considers to be of strategic importance.
Intel was late to the portable device market. Will it be late again to the wearable smart device market?

Intel extends push into wearable smart devices

SAN FRANCISCO — Intel Corp chief executive Brian Krzanich showed off a computer built into a jacket button and a wristband that transforms into a selfie-snapping flying camera, as the chipmaker extends its push into smart wearable gadgets.

Speaking at the Consumer Electronics Show in Las Vegas on Tuesday, Mr Krzanich used most of his keynote to talk up Intel’s efforts in computerised apparel and other sensor-packed gadgets — nascent markets that the chipmaker and other technology companies hope will fuel future growth as demand for smartphones and tablets loses steam.

Curie, a new button-sized computer for smart clothes, is due out later this year and includes Bluetooth radio as well as the latest from Intel’s Quark line of low-power chips. Intel’s chips so far have not made significant inroads into wearable gadgets such as fitness bands or smart watches.

“With this product, they can deliver wearables in a range of form factors,” Mr Krzanich said of Intel’s manufacturing customers.

“Rings, bags, bracelets, pendants and yes, even the buttons on our jackets.”
Intel always seems to lag the next market shift. It is still "stuck" in the PC market...

(not vested)

Intel forecasts disappointing revenue; shares fall

Chipmaker Intel Corp forecast current-quarter revenue and gross margins that disappointed investors, sending its shares down more than 2 percent in extended trading.

Revenue from its mainstay PC business fell about 3 percent to $8.9 billion in the fourth quarter from the third, raising doubts about the expected recovery of the personal computer business.
Intel's sales forecast is consistent with IDC's outlook...

Intel Cuts Sales Forecast on Lower Demand for Computers
(Mar 12): Intel Corp., the world’s largest chipmaker, reduced its first-quarter sales forecast, citing lower-than-anticipated demand for corporate computers and weakening economies, particularly in Europe. The shares fell as much as 5.4 percent.

Revenue is now expected to be $12.8 billion, plus or minus $300 million, the Santa Clara, California-based company said in a statement Thursday. The previous outlook had been for $13.7 billion, give or take $500 million.

IDC widens global 2015 PC shipments outlook to 4.9% decline
SAN FRANCISCO (March 13): Worldwide personal-computer shipments will fall further than forecast this year as the strong US dollar and the lack of new products threaten sales, researcher IDC said.

Shipments will drop 4.9 percent in 2015 instead of the previous forecast for 3.3 percent decline, the Framingham, Massachusetts-based research firm said in a statement.
A valuation of nine (9) times sales (revenue), based on forward rather than historical revenue, is only applicable to technology stocks, I guess...

The data center biz is getting a lot of attention from the market. It has become a buzzword nowadays...

(not vested)

Intel buys chip maker Altera for S$22.6b

SANTA CLARA (CALIFORNIA) — Intel Corp has agreed to buy Altera for US$16.7 billion (S$22.6 billion) as the world’s biggest semiconductor maker seeks to make up for slowing demand from the personal computer industry by expanding its line-up of higher-margin chips used in data centres.

By combining with Altera, Intel will be able to bundle its processing chips with the smaller company’s programmable chips, which are used, among other things, to speed up web searches. Santa Clara-based Intel said on Monday it would offer US$54 per share for San Jose, California-based Altera, a 10.5 per cent premium to the latter’s closing price on Friday.

The deal valued Altera at about nine times forward revenue, Thomson Reuters data showed.
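As a rough sanity check on the figures quoted, the offer premium and the forward-revenue multiple can be back-solved (a sketch only; it assumes the 10.5 per cent premium is measured against Altera's Friday close and rounds to the reported precision):

```python
# Back-of-the-envelope check of the Altera deal terms quoted above.
# Assumption: the 10.5% premium applies to Altera's prior Friday close.
offer_per_share = 54.00            # Intel's cash offer, US$ per share
premium = 0.105                    # premium over the prior close
implied_close = offer_per_share / (1 + premium)
print(round(implied_close, 2))     # implied Friday close, roughly $48.87

deal_value_bn = 16.7               # deal value, US$ billions
ps_multiple = 9                    # forward price-to-sales multiple
implied_fwd_revenue_bn = deal_value_bn / ps_multiple
print(round(implied_fwd_revenue_bn, 2))  # implied forward revenue, ~$1.86bn
```

The implied forward revenue of roughly US$1.9 billion is what makes the nine-times multiple look rich next to Altera's size.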

“It seems very high to me. The last one I remember that was close was Broadcom buying NetLogic at eight times forward revenue, and that didn’t turn out very well for Broadcom,” said Stifel, Nicolaus & Co analyst Kevin Cassidy.
Watch Out Intel, Here Comes Facebook
By Tiernan Ray

2676 words
31 Oct 2015
Barron's Online


For 50 years, the computer-chip industry has ridden a road of economic miracles, but now it's reaching an end, and a potential crisis is unfolding for some of the industry's biggest names.
The phenomenon known as Moore's Law, which held that the number of transistors on a chip doubled every two years while the cost fell by half, led to vibrant markets such as the personal computer, with ever-rising performance and falling prices. But Moore's Law is now breaking down, and the physics of chips is becoming treacherous. Approaching the atomic level in size, transistors have started performing less reliably, making them less efficient. Intel and other semiconductor makers have had to add extra manufacturing steps, increasing costs.
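The economics described above can be made concrete with idealized numbers (these are illustrative, not figures from the article): one doubling of transistor count, and one halving of per-transistor cost, every two years.

```python
# Idealized Moore's Law arithmetic (illustrative numbers, not from the
# article): transistor count doubles and per-transistor cost halves
# every two-year period.
def moores_law(years, transistors=1.0, cost=1.0):
    doublings = years // 2          # one doubling per two-year period
    return transistors * 2 ** doublings, cost / 2 ** doublings

t, c = moores_law(10)               # a decade = five doublings
print(t, c)                         # 32.0 0.03125
```

A decade of that compounding yields 32x the transistors at about 3% of the per-transistor cost, which is why the breakdown of the trend is so economically painful.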

“The semiconductor industry is up against an economic problem," says Linley Gwennap, a longtime chip-industry analyst. “Now that you can't automatically reduce the cost to get the same performance, Intel and the rest are stuck trying to justify the investment."
The industry's response has been to circle the wagons. This year, there has been more than $136 billion worth of announced semiconductor mergers and acquisitions, according to Bloomberg, an unprecedented volume. Companies are busily streamlining costs and merging portfolios to eke out profit, rather than investing in new innovation.
At the same time, another even more important challenge is brewing for traditional chip makers: the cloud, or Internet-based computing. Driven by the need to analyze massive amounts of data, the Goliaths of the Internet—Alphabet (ticker: GOOGL), Facebook (FB), Amazon (AMZN), and Microsoft (MSFT)—are altering the very nature of computing. In the process, they are pressing at the boundaries of chip technology and forcing a major change.

The economic impact of the new direction in semiconductors—chips that think rather than simply compute—could lead to a renaissance in the industry. As a result, the coming decade may be one of the brightest for chip investors. The implications go far beyond Wall Street's near-term view.
One major beneficiary could be Micron Technology (MU). Its memory chips could play a key role in cloud chips, which will rely less on processing power than on vast amounts of memory. The only pure-play memory-chip maker, Micron traded last week at $16.90 a share, less than half its 52-week high, reflecting a drop in the price of commodity memory chips. In the next few years, the stock could rise by a multiple of that price, if the industry develops as we expect.
Broadcom (BRCM), a maker of networking processors, could benefit with chips that search for information instead of just calculating. The same holds for makers of graphics chips, like Nvidia (NVDA), whose components make possible the mining of vast amounts of information.
Intel's (INTC) dominant franchise in chips that power server computers is threatened by the new wave, but the company could still end up coming out all right by playing to its strengths in memory and programmable chips, and its deep relationships with providers of cloud computing. Qualcomm (QCOM), too, will need to adapt. Long the dominant maker of communications chips, it might need to bulk up on memory chips to stay relevant.
ALPHABET (FORMERLY GOOGLE) and its peers are solving some of the biggest computing challenges ever attempted, such as automatically translating words into different languages, and understanding human speech. The company aims to make image recognition more accurate, and has made dramatic progress in recent years in improving computers' ability to understand English-language commands with fewer errors.
Facebook has a team devoted to refining machine translation of languages. English isn't the native language of a large portion of Facebook's 1.5 billion users. Providing them automatic translations of posts is a priority. Amazon's AWS and Microsoft's Azure cloud-computing services would like to offer more and increasingly sophisticated services, such as analysis of big data for clients.
All of these companies have hired away many of the leading lights in neural networks and related fields, and have assigned to machine-learning projects their most talented engineers, such as computer scientist Jeff Dean, who advises Alphabet's research and machine-intelligence effort, which the media has dubbed Google Brain.
Their work draws on traditions of artificial intelligence, but they don't aim to crack the mysteries of the brain. Instead, they seek to create useful products and services loosely inspired by neuroscience. Their efforts depend on feeding a computer millions of data samples and training it to recognize patterns. That heavy lifting is already changing the nature of computer-chip usage.
The Internet's server computers, which dish up search results, Facebook “likes," and all the rest, traditionally have used microprocessor chips, a technology dominated by Intel. But the new machines performing A.I.-like tasks increasingly employ a different chip, called a graphics processing unit, or GPU, to boost performance.
Once the domain of videogames and scientific computing, GPUs excel at crunching clusters of data simultaneously. Alphabet and the rest are buying them like they're going out of style. As a result, shares of the primary supplier of GPUs, Nvidia, are up 50% in the past 12 months, although shares of another supplier, Advanced Micro Devices (AMD), have fallen sharply due to company-specific problems. If demand plays out as we foresee, it could lift the fortunes of both. They might even become takeout targets of companies such as Qualcomm and Broadcom, or the cloud giants.
THE VERY NATURE of most chips is archaic and in need of rethinking. Most computers today—from smartphones to giant database computers—are based on a design laid down by famed mathematician John von Neumann in the 1940s. Von Neumann theorized that a computer should be a machine used mostly for calculating, and stuffed with circuits for doing math. These so-called logic circuits should have just enough memory circuitry to store the instructions created by humans.
The von Neumann Machine, as it is known, has become obsolete. Problems tackled by projects such as Google's image-recognition system might require only simple math. What they really need are huge amounts of memory. The solution: Flip the equation. Make chips more about memory, less about logic.
One proponent of this approach is Jeff Hawkins, who founded Palm Computing, maker of the Palm Pilot, in the 1990s. Today, he runs a start-up called Numenta, whose mission is to reverse-engineer the brain's neocortex, to find principles of the human mind that will inform the design of intelligent machines.
On a recent morning, Hawkins demonstrated an app created using his technology. Servers monitoring the flow of Twitter messages about Wal-Mart Stores (WMT), as well as the stock price and trading volume, flashed a warning of an anomaly: a sudden spike in tweets preceding a sharp drop in Wal-Mart's shares. That morning, Wal-Mart announced it would raise minimum hourly wages, sending its stock plunging. The tweets were the canary in the coal mine.
Using Numenta's algorithms, models can be built tracking hundreds of thousands of data feeds simultaneously—for derivatives, currencies, and numerous other domains—as a kind of early-warning system for markets. But to do that, “we've got to have hardware," says Hawkins, meaning specialized hardware that goes beyond Intel microprocessors and even GPUs.
“We've had 70 years of programmed computing," says Hawkins, referring to the von Neumann Machine. “The next 100 years will be one of memory systems."
IF THE FUTURE is full of memory, Micron might have some of the most valuable assets in the industry. Last summer Micron announced a new kind of memory chip, the 3D XPoint, in partnership with Intel. It's a thousand times faster than the flash memory chips that store files in an iPhone, and holds 10 times as much data as DRAM chips that are the main memory in PCs.
XPoint might be a perfect combination of the speediness of DRAM and the storage capacity of flash, making it more economical than either one. But while XPoint's announcement stunned the industry, experts are skeptical because Micron hasn't revealed the special semiconductor materials it uses.
“To a certain extent, the value of the technology is purely related to the materials, so we're intentionally being vague because we really don't want our competitors to know exactly what it is we're up to," says Micron's vice president of research and development, Scott DeBoer. What is profound about XPoint is less the chip itself and more the fact that Micron seemingly has perfected a way to combine memory with logic, a task that has been attempted over the years with little success.
The benefit for Micron is obvious. Its DRAM and flash memory today are commodities. Chips merging logic and memory in novel ways would be unique and carry higher profit margins.
Others are aware of this paradigm shift in memory, such as Broadcom, which specializes in communications chips. It is merging with Avago Technologies (AVGO) in the industry's largest-ever deal, valued at $37 billion. Designing “dedicated processors" for the cloud-computing companies “is going to be an interesting challenge," says Henry Samueli, Broadcom's chief technology officer. “It will look different from the traditional compute processor, because it's not so much calculations that you are doing; it's more about searching."
Broadcom's chips for computer networking, which the company already is shipping, are relevant, he says, because they are in effect “a massive memory chip."
Where this leaves Intel isn't clear. The server market is the healthiest of its markets at the moment. But if the shift is toward GPUs, and ultimately toward a new kind of memory chip, Intel's server sales could suffer.
Intel isn't unaware of all this. It has been working with the cloud companies since 2007, and has “numerous Intel fellows" pondering the problem of machine learning. Half of Intel's Xeon shipments to cloud companies are custom chips, says Intel's data-center chief, Diane Bryant, so they can be tailored to different approaches to neural networks. If a cloud company is still working out the math, Intel's programmable chips can be fine-tuned. “That was one of the big vectors in our acquisition of Altera," she says.
Intel agreed to buy Altera in June.
Intel also has a competitor to GPUs, called Xeon Phi, which aims to take market share from Nvidia. “It's a small portion of our business today," Bryant says of machine-learning projects, “but it's exciting, and it's clear that that is where the puck is going. My job is to win the whole thing."
Intel, too, has a best friend in Microsoft, which is willing to work with the chip maker on innovations. Says Mark Russinovich, Azure's chief technology officer: “We are both finding ways to move the computing operation closer to the data, and also finding ways to more easily move the data to the computing operations."
The bigger risk for Intel, Micron, and others is that if they don't come up with the goods, the cloud companies may do it themselves. “There is a competitive advantage to putting their algorithms into their own silicon," says Gartner's Martin Reynolds of the potential for custom cloud chips.
The cloud companies wouldn't need factories, just the hundreds of millions of dollars it takes to design chips. Some other company, such as Taiwan Semiconductor Manufacturing (TSM), would produce the actual product. Cloud companies have billions to spend in their quest to gain a competitive edge.
Sources throughout Silicon Valley, including those who have spoken with Alphabet's Dean, believe that Alphabet is working on chip designs, although no one is sure what they are. “The limits we are hitting are memory bandwidth," says Facebook's hardware guru, Yann LeCun. “The way to get performance is to organize the memory, and put it closer to the [part of the chip that does the math]." He notes that Facebook has commissioned custom chips in the past, for tasks such as handling communications in its server computers.
Facebook is receptive to Intel or another vendor making its own neural-network processor, but so far no one has delivered. “If they don't, then we'll have to go with an industry partner who will build hardware to specs, or we'll build our own," he says.
THE NEXT GENERATION of cloud companies isn't waiting for the chip industry to catch up. Near Qualcomm's headquarters in San Diego, two neuroscientists, Naveen Rao and Amir Khosrowshahi, run a start-up playfully called Nervana. They are designing their own chip that can analyze vast amounts of data, but they're not looking to sell it. Instead, they'll use it as the secret sauce to provide data analytics for a cloud-computing service.
Here, too, the memory revolution is evident. Their chip has thousands of little logic engines sitting next to pools of memory, with a plethora of connections between them. They claim the chip can perform in a fraction of the time an image-recognition task that would take an Intel server chip 2,000 hours, or an Nvidia GPU, 33 hours. The key is that they are designing for specific algorithms, so the chip's design is more efficient than a general-purpose chip like Intel's or Nvidia's.
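The timing claims above imply a large gap even before Nervana's chip enters the picture (a sketch; it assumes an identical image-recognition workload, and the article does not quantify Nervana's "fraction of the time", so only the CPU-versus-GPU ratio is computed):

```python
# Speedup implied by the timing claims quoted above.
# Assumption: the same image-recognition task on each chip.
cpu_hours = 2000    # Intel server chip, per the article
gpu_hours = 33      # Nvidia GPU, per the article
speedup = cpu_hours / gpu_hours
print(round(speedup, 1))   # GPU is roughly 60.6x faster on this workload
```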
The cloud really kicks in when those same types of chips move down into devices we use every day, making them intelligent machines. “There are cars driving around Silicon Valley right now with a guy sitting there doing nothing, not even touching the steering wheel," says Aart de Geus, who runs Synopsys (SNPS), a vendor of software programs used by virtually all chip companies to design their wares. “When you add smarts to the Internet of Things, you will have the reality of artificial intelligence," he predicts.
That is starting to happen already to cars and flying drones and other devices with circuits customized to a task. Chip designers Mobileye (MBLY) and Ambarella (AMBA) are two good examples. Both companies start by figuring out which mathematical algorithm they are trying to solve for a given problem, such as analyzing reams and reams of video.
Says Les Kohn, Ambarella's chief technologist: “Computer vision will be the next stage for things such as drones, where it enables the drone to pilot itself with minimal or no human intervention. For that, you need something that goes beyond a von Neumann Machine, a processor designed specifically for the algorithms of the problem at hand."
Mobileye's chips analyze multiple video feeds from cameras on advanced cars such as the Tesla. Their circuitry has sections devoted to rapidly “digesting" large amounts of video data about the road. “We have the ability to give the car a very deep neural-network ability, an ability to understand the entire scene on the road as you drive," says Itay Gat, Mobileye's head of R&D.
If the prior generation of chip makers got lazy from making easy gains with Moore's Law, a new generation is bringing back innovation. Dan Armbrust, who used to run chip manufacturing at IBM, has founded Silicon Catalyst, an incubator for chip start-ups. His companies are inventing amazing things. One of them, Ayar Labs, is fashioning on a single chip all of the components that drive a fiberoptic connection.
Another, Silicium Energy, uses silicon to convert your body heat into battery power that could run your smartwatch or other wearables for days, weeks, or years. In an industry of old-timers, these companies' founders, remarkably, are all under 40.
“The idea that just riding Moore's Law is the ticket to the jackpot is less and less true," Armbrust says. “And that opens a door to innovation again, bringing Silicon Valley back to its roots."

Dow Jones & Company, Inc.
Intel's first major mobile win comes from Apple!
Looks like Intel is also jumping on the bandwagon, making a big move into autonomous driving. Taxi driver may be an occupation of the past ;D

Intel buys Israeli self-driving car firm for $15 billion
Virtual currencies are worth virtually nothing.
The biggest losers will be the young people who are uber/grab drivers. Not only do they risk being made obsolete by driverless cabs, they also face career risk by not moving up the value chain. Who will employ a 30-something with no skills?

And when these ride-hailing companies finish burning through their cash to offer discounts to passengers, people will return to other modes of transportation. Of course, uber/grab is betting that commuters become dependent on them, such that demand for their services holds up once the discounts end. But if this does not pan out, and if the government's initiative to reduce car ownership is not successful either, uber/grab drivers may end up with less income.
