Defunding Innovation

By John Donovan

I read two articles this morning that together bookend a singularly bad idea: defunding scientific R&D just when other countries – China in particular – are making it a national priority.

Chinese Incentives

The first was an article in the New York Times titled “When Innovation, Too, Is Made in China”:

“As a national strategy, China is trying to build an economy that relies on innovation rather than imitation. Clearly, its leaders recognize that being the world’s low-cost workshop for assembling the breakthrough products designed elsewhere — think iPads and a host of other high-tech goods — has its limits.”

To start putting the infrastructure in place, the State Intellectual Property Office of China introduced in November a “National Patent Development Strategy (2011-2020),” with the goal of surpassing the United States in patent filings in 2011, which looks likely. In 2009, about 300,000 patent applications were filed in China; the national plan calls for a huge leap to one million by 2015, more than double what innovators in the United States filed in the last 12 months.

This may be a government-sponsored initiative, but the incentives put in place recognize that real innovation comes from individuals and private industry:

“To lift its patent count, China has introduced an array of incentives. They include cash bonuses, better housing for individual filers and tax breaks for companies that are prolific patent producers.”

These are pretty powerful inducements in a very individualistic, entrepreneurial society that just happens to have a Communist government.

American Defunding

The second article is an op-ed in the LA Times, “Fixing the economy the scientific way.” In its two “Gathering Storm” reports the National Academy of Sciences “has argued strongly that our future prosperity depends on investments made now in research and innovation.” As the authors here point out,

“The basic premise rests on the work of Nobel prize-winning economist Robert Solow, who documented that advances in technology and knowledge drove US economic growth in the first half of the 20th century. If it was true then it’s even more so in today’s information economy.”

Nevertheless, between 1964 and 2004 the US government’s support of science declined 60% as a percentage of GDP. China, meanwhile, continues to invest heavily in the future. For example, China invested $34.6 billion in the clean energy sector in 2009; the US invested $18.6 billion. Already most of the solar panels and wind turbines being installed in the United States are made in China. This is one technology sector where the United States can’t afford to be left behind, but that’s exactly what’s happening.

This is the place where you make the argument, “In this country we don’t believe in industrial policy, choosing one industry over another. That’s socialism. We leave it to the market to sort things out.” OK, then what do you call R&D tax breaks and tax holidays that even the Reddest states (like Texas) use to lure high-tech industries?

You can make the case that industrial policy doesn’t always work. In the 1980s MITI in Japan heavily bankrolled the semiconductor and computer industries, and Japanese companies became dominant in those sectors worldwide. That dominance was eventually overcome by more entrepreneurial American companies, though not without considerable assistance, and some strong-arming, from the US government.

I’d argue that without MITI’s guidance and backing you would never have heard of today’s giant Japanese semiconductor and computer firms. The same goes for ETRI in Korea and ITRI in Taiwan, both of which nurtured promising local companies to international prominence.

Investing in the Future

As the authors of the LA Times article point out,

“And we need to recognize that the cost of basic science, and the time it takes, require a sustained government commitment because industry can’t be relied on to fund incremental and high-risk science for its own sake without any guarantee of payoff.”

When it comes to R&D, American companies tend heavily toward the D, or development, side, with pure research being left to universities. This works as long as research grant funding holds up, but that funding is now in the process of disappearing. The National Institute on Aging is now turning down more than 90% of scientifically meritorious research grant proposals due to an inability to finance them. Leading research universities such as my alma mater, the University of California, Berkeley, are falling behind as their funding sources dry up. The industries and startups that rely on this research will suffer as a result.

As philanthropist Mary Lasker remarked, “If you think research is expensive, try disease!” As the baby boom generation continues to age, our annual federal Medicare expenditure on Alzheimer’s is projected to reach $627 billion, well in excess of the entire Medicare budget for 2010 ($468 billion). Humanitarian considerations aside, putting tens of millions of dollars per year into Alzheimer’s research could be one of the best investments we’ve ever made.

Don’t Just Say No

The current Republican “Pledge to America” calls for a reduction in federal spending on non-defense-related science research to pre-stimulus levels. The National Institutes of Health, the National Science Foundation and other major sources of research grants are facing substantial budget cuts. This is shortsighted in the extreme.

The whole concept of investing involves putting money into what looks like worthwhile ventures now in the hopes of greater returns later. We need to continue to invest in our future if we want it to be somewhere we’d like to be.

Mentor Graphics Acquires CodeSourcery

December 2, 2010—Mentor Graphics has acquired most of the assets of CodeSourcery, the leading provider of open source GNU-based tool chains and services for advanced systems development. The deal builds on Mentor’s acquisition last year of Embedded Alley, whose runtime Linux and Android offerings moved Mentor firmly into the open-source embedded arena. The tools acquired from CodeSourcery should neatly leverage that investment and further solidify Mentor’s position in the embedded space. Mark Mitchell, former CEO of CodeSourcery, will stay on as director of embedded tools for the Mentor Graphics Embedded Software Division.

With Mentor’s Nucleus RTOS powering over two billion mobile handsets, Mentor is hardly a Johnny-come-lately to the embedded market. Now in addition to its proprietary RTOS it can also offer Linux, Android and leading open-source tools for developing embedded systems. With Linux holding a dominant share of the embedded market and Android a rapidly growing presence in handsets, Mentor has positioned itself to ride these fast-growing markets. The Sourcery G++ GNU-based IDE already has a wide following in the open source community.

Mentor isn’t the first software company with a proprietary RTOS to see the open source light. Wind River was doing very well with its VxWorks RTOS and initially spent time dissing Linux before deciding it made more sense to ride that horse than race against it. Today Wind River has its own fork of Linux that integrates with its tools and databases, which opened up new markets and gave the company more credibility in the open source world. It acquired enough expertise in the process that Intel eventually bought the company for it.

The OS business model nicely supplements the EDA industry’s standard charge-by-the-seat revenue model for tools, where you get paid once a year for a very finite number of seats. Royalties on an RTOS that ships in half a billion handsets every year, on the other hand, scale nicely. And you don’t have billions in fab costs to offset before you can make a profit.
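
To put the scaling difference in rough numbers, here’s a back-of-the-envelope comparison. The seat count, seat price and per-unit royalty are hypothetical round figures chosen only for illustration, not actual Mentor or EDA pricing.

```python
# Hypothetical figures for illustration only -- not real EDA or RTOS pricing.
seats = 1_000                  # tool seats licensed in a year (engineer-limited)
seat_price = 100_000           # annual fee per seat ($)
handsets = 500_000_000         # handsets shipping with the RTOS per year
royalty = 0.30                 # per-unit royalty ($)

seat_revenue = seats * seat_price       # capped by the customer's headcount
royalty_revenue = handsets * royalty    # scales with end-product volume

print(f"Per-seat tool revenue: ${seat_revenue:,}")
print(f"RTOS royalty revenue:  ${royalty_revenue:,.0f}")
# Seat revenue grows only as fast as engineering teams do; royalty revenue
# grows with every unit the customer ships.
```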

EDA companies are all trying to find ways to grow faster than the slow pace of the EDA market. The Big Three each have different—if not that different—strategies. Synopsys continues to build a large portfolio of semiconductor IP that they guarantee will work together, easing the pain of IP integration and verification. Cadence’s EDA360 envisions an application-driven approach to system design, around which they’re reorganizing their tool offerings, not to mention their company. Mentor’s approach to faster growth is clearly to move more forcefully into the embedded space, adding a lot of open source tools to their toolkit. The extensive ecosystem and customer base that comes with Mentor’s two open-source acquisitions takes the edge off any risk.

In Search of 500 MHz for Wireless Broadband

Everyone likes high-speed wireless access for their mobile devices. Unfortunately, while the demand seems limitless, the supply is highly limited. Almost all available spectrum from microwaves on down is already allocated and increasingly crowded.

Despite this inconvenient truth (or rather because of it), on June 28, 2010 President Obama signed a Presidential Memorandum directing the Secretary of Commerce, through the National Telecommunications and Information Administration (NTIA), to collaborate with the FCC on a ten-year plan and timetable for making 500 megahertz of Federal and non-Federal spectrum available for wireless broadband use, while ensuring no loss of existing critical government capabilities and providing for appropriate enforcement mechanisms and authorities.

The NTIA and DOC have subsequently released two reports:

  1. A ten-year plan and timetable to make 500 megahertz of Federal and non-Federal spectrum available for wireless broadband use; and
  2. A fast-track evaluation of the 1675-1710 MHz, 1755-1780 MHz, 3500-3650 MHz, 4200-4220 MHz and 4380-4400 MHz bands.

The NTIA has identified 2,200 megahertz of spectrum, including the four fast-track bands, as candidates for wireless broadband review. The report provides a roadmap for identifying wireless spectrum assigned to both Federal and non-Federal users that can be allocated for wireless broadband, as well as for using all spectrum more efficiently.

A detailed fact sheet spelling out the highlights of the reports is available here. The bottom line is that it should be no problem to find a spare 500 MHz of useful spectrum out of the proposed 2,200 MHz…after the FCC holds more meetings to hear from and address the concerns of a wide range of stakeholders.

That won’t be easy and it won’t happen overnight. But at some point it will happen. When it does, expect to see an outpouring of applications to take advantage of the newly opened spectrum. When the FCC opened up the ISM bands to unlicensed devices it gave rise to new wireless technologies that created multi-billion dollar industries. I’d expect nothing less the next time around.

[Chart: FCC band plan]

Cadence Advocates Application-Driven Design

At last week’s CDN Live event, the Cadence team detailed their vision for the future of the EDA industry: not point tools for engineers but standards-based, end-user-oriented, application-driven design.

CEO Lip-Bu Tan pointed out that the $350B electronics industry is still growing fast despite the recession, with much of that growth driven by an explosion of applications. Meanwhile the EDA industry is ‘mature’, plagued by slow growth and uninspiring stock performance. There has to be a better way, and everyone in the industry is looking for it. Cadence has an interesting approach.

EDA360—first announced in April—looks to move Cadence beyond design tools to complete designs. The first step is to expand the IP portfolio, and Cadence’s recent acquisitions of Denali and Taray were strategic in that regard. Synopsys, with more cash to throw around, is racing down the same road through its acquisitions of Virage Logic and CoWare, emphasizing IP that’s guaranteed to work together—a major pain point for chip designers. Mentor Graphics has been building its IP portfolio for years, focusing on the embedded space. Cadence hopes to quickly expand its own portfolio through “increasing cooperation with [our] ecosystem partners,” including IBM, TI, ARM and TSMC.

“There’s an app for that!”

Having accumulated a lot of IP, what do you do with it? Design stuff, using your suite of EDA tools as a virtualization platform that can turn system-level models into optimized hardware/software systems to support a range of software applications. Cadence has some gaps to fill in its tool flow to pull this off, but EDA360 is a roadmap to more than just more tools.

The key word here is ‘applications’, which CMO John Bruggeman claims need to drive hardware/software design, a process he calls “application-driven system realization.” According to Bruggeman, the current design model is suboptimal at best: “If hardware is designed first and software is appended later, it is difficult to optimize at the system level.” No argument there.

Then in 2006, according to Bruggeman, Apple came up with a new model with its iPhone: design optimized proprietary hardware and software platforms that work together. Application developers had to work within the limitations dictated by these platforms, which arguably limited their functionality, but if they did so everything worked well.

Instead Bruggeman advocates starting with an understanding of the software applications that will run on a given hardware/software platform, defining system requirements and then working down to hardware and software IP creation and integration. This entails reconfigurable hardware driven by software, and not just the operating system. “The OS…has a very generic understanding of the underlying software,” so applications need to take direct control of hardware functions. That involves EDA companies delivering “an optimized IP stack—from bare-metal software to the physical layer—that allows visibility into and control of the hardware from the OS and the application.”

Bruggeman has been advocating silicon-aware IP—not to mention virtual platforms—since his days at Wind River; the company worked with semiconductor firms to develop software drivers to enable VxWorks to work closely with leading processors. Intel saw the value in closer OS/silicon integration, which is why they bought Wind River. At Cadence Bruggeman has extended the vision upward to the application layer. This is hardly uncharted territory, but here the solutions aren’t quite so obvious.

How do you determine application requirements in advance when there are hundreds of thousands of them competing to be on your iPhone? And what happens when you have 60 of them running at once in the background with five of them active? How do you design for that? The answer would seem to be a hardware/software platform that takes its orders from the applications and does, within limits, whatever is needed to service them, either singly or in bulk. That’s what the OS is supposed to do. Perhaps what we need is smarter OSs. The alternative could be “over-designed hardware,” which is what Bruggeman says his approach is designed to avoid. Bruggeman points to Android as an example of an open “multi-application platform,” which does clarify his frame of reference.
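
As a thought experiment of what “taking orders from the applications” might mean in practice (my own toy sketch, not anything Cadence or Bruggeman has shown), imagine each application declaring its requirements and a platform manager choosing the cheapest hardware operating point that covers the aggregate demand:

```python
# Toy sketch of application-driven platform control. All names and numbers
# are hypothetical illustrations, not any vendor's actual API.
from dataclasses import dataclass

@dataclass
class AppDemand:
    name: str
    min_mips: int           # compute the app declares it needs
    max_latency_ms: float   # responsiveness it declares it needs

# Hypothetical operating points: (name, MIPS available, worst-case latency, mW)
OPERATING_POINTS = [
    ("low",     200, 20.0,  50),
    ("medium",  800, 10.0, 200),
    ("high",   2000,  5.0, 600),
]

def pick_operating_point(active_apps):
    """Choose the lowest-power point that covers the apps' aggregate demand."""
    need_mips = sum(a.min_mips for a in active_apps)
    need_latency = min((a.max_latency_ms for a in active_apps), default=float("inf"))
    for name, mips, latency_ms, power_mw in OPERATING_POINTS:  # low power first
        if mips >= need_mips and latency_ms <= need_latency:
            return name, power_mw
    name, _, _, power_mw = OPERATING_POINTS[-1]                # best effort
    return name, power_mw

apps = [AppDemand("email sync", 50, 50.0), AppDemand("video call", 600, 10.0)]
print(pick_operating_point(apps))   # -> ('medium', 200)
```

A smarter OS governor along these lines is one plausible answer to the sixty-apps-in-the-background question; whether an EDA company is the right party to build it is another matter.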

How Hard Can It Be?

A system designed around a reasonable set of assumptions about the demands of a well-defined range of applications is not a far-fetched idea. What Cadence is proposing is basically an ESL tool suite customized to generate complete applications. To deliver that they’ll have to supply not just the tools but also an embedded software infrastructure—with a full suite of tested, integrated and verified IP—plus an OS, middleware and reference applications. No small task, and not one that any EDA company is even attempting today, though you could argue that they’re moving in that direction. Cadence certainly is.

Supplying reference designs could put Cadence in conflict with its semiconductor customers, who have long had to supply reference designs—right down to the Gerber files—in order to secure design wins. National Semiconductor’s online WEBENCH tools enable you to design fairly complex systems and generate a complete BOM, including non-NSC chips. National is moving onto EDA turf even as Cadence plans to move closer to theirs. Bruggeman doesn’t foresee a conflict here, but I see plenty of potential for one.

Being able to supply a range of reference designs is going to require a lot of systems-level domain expertise, something that is in short supply in EDA companies. Semiconductor companies have been acquiring it for years in very specific areas in order to win sockets with key customers. But more generalized expertise is time consuming and expensive to acquire. Xilinx has set out to develop a range of FPGA-based reference platforms for some high-margin industries (medical, automotive, mil/aero) and they’re doing a good job of it, but this is a steep hill to climb.

Are We There Yet?

EDA360 calls for application-driven hardware/software creation, integration and verification. Getting there involves a lot more than filling in some gaps in disjointed tool flows. Even bracketing the idea of application-driven design for the moment, this all requires a smooth ESL design flow from system-level modeling right through tape-out, with early models being able to generate detailed, integrated hardware and software designs. We aren’t nearly there yet, though we’re getting closer (perhaps asymptotically).

Truth be told, a lot of the focus on applications is driven by the attractive business model. When you buy a TV, for example, Samsung gets money from you once every 10-15 years. If you buy an EDA tool, you pay annually per seat. But if your TV boots up and offers a store full of cool apps, Samsung gets a nice ongoing cash flow—and so does Cadence, if it designed the apps or licensed the IP used to create them. That, along with better stock valuations, is the lure behind application-driven design.

EDA360 isn’t the only way forward for the EDA industry, and in a lot of ways it’s a restatement of what everyone else has long been doing. From that point of view Cadence has dug its way out of a hole, is executing well and has belatedly joined the party.

To be fair the EDA360 vision goes well beyond what most companies are attempting, and some parts of it are fairly radical. Neither Cadence nor its competitors are currently able to execute on all of it, but it does reveal a reinvigorated company with a vision, one that inevitably will be clarified over time. Cadence has spent a lot of time trying to get everyone in the company to understand the concept and get on board. That’s a necessary first step and in itself a major accomplishment on the part of the company’s new leadership.

As Gary Smith said recently, “This is the best Cadence to date.” I quite agree.

[To watch video interviews with Cadence execs re. EDA360 click here.]

Otellini and Perlmutter Kick off IDF

Paul Otellini kicked off this year’s Intel Developer Forum (IDF) by declaring, “Intel used to be a chip company…now we’re becoming a solutions provider.” Intel is just the latest semiconductor firm forced to move up the food chain, offering more than just chips.

The PC market isn’t exactly stagnant, with over one million PCs shipping per day according to Otellini, adding to the over 1.4 billion PCs out there today. Gartner predicts the PC market will continue to grow at over 18% per year through 2011, though other analysts question whether this mature market can support the aggressive growth for Intel that Otellini has promised investors.

The answer lies where the action is: the explosive growth in Internet-connected devices—over 5 billion now, of which Otellini estimates roughly half are so-called “smart devices,” a figure he predicts will grow to 5 billion by 2014. Intel wants a piece of this action, and how it plans to get there was at the heart of this morning’s keynotes.

The goal is to break out of the PC box and extend the Intel architecture (IA) into a wide range of Internet-connected devices, offering developers “a full PC-compatible stack” and a way to seamlessly move and use music, video and data across different devices. Otellini refers to this as “port choice…The whole world is about apps,” and users expect those apps to work the same way on PCs, netbooks, smartphones, tablets, etc. As Dadi Perlmutter put it in his keynote, “Users want a seamless computing experience.” Intel’s acquisition of Wind River was a move in this direction.

Intel is also focusing on wireless connectivity between devices. It acquired Infineon’s wireless group for its 3G and LTE technology, a potential entry point into handsets, where ARM has a lock on processor sockets. It also purchased TI’s cable modem operation to push into the Internet TV market—Intel’s partnership with Google to create Google TV being a case in point.

On the chip level Intel’s big play this IDF is its Sandy Bridge processor, which incorporates a graphics processor (GPU) on the same die as the CPU. Despite an extended demo of gaming graphics, it’s unlikely that any serious gamer is going to go out and buy a PC without a separate GPU. Otellini referred to Sandy Bridge as “revolutionary,” which it may be for Intel but not for the rest of the world. Still, it’s a big step forward for Intel in graphics processing speed, which Perlmutter said has increased 25x over what Intel chips could deliver as recently as 2006.

Now if Intel can come out with a low-power version of Sandy Bridge that’s suitable for a wide range of portable embedded devices, then they’d really have something. Expect to see a lot of action on that front.

Ultra Low Power Electronics in the Next Decade

As a TI Fellow and director of TI’s Kilby Research Labs, Ajith Amerasekera’s job is to predict the future and plot a roadmap to it. His keynote on day two of the low-power electronics show (ISLPED) in Austin—“Ultra Low Power Electronics in the Next Decade”—did both. [Spoiler alert:] There are some major bridges to be crossed, and arrival at the end point is far from guaranteed.

Just as they have for the last several years, portable devices will continue to drive growth in the electronics industry. Far from just handsets, the mobile internet—also encompassing “the internet of things”—represents a huge expansion of the semiconductor application space to include a wide range of wireless home entertainment, automotive safety and autonomous industrial, military and medical devices. The mobile internet promises to be 10-100x larger in unit volume than the desktop internet ever was.

Amerasekera distinguishes between two types of portable electronics: performance “hub” devices such as computers, multimedia devices, wireless hubs and PDAs, which need 1 W to 5 W today; and distributed, largely autonomous systems with microwatt and nanowatt needs. A typical autonomous system—for example, wireless strain gauges in bridges and aircraft wings—has a life expectancy of up to 10 years. Assuming such a device is powered by today’s typical 100 mAh cell phone battery, the average power available from the battery is less than 1 µW. That isn’t possible with today’s technologies.
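
The arithmetic behind that claim is worth spelling out. The nominal cell voltage and the note about losses below are my assumptions, not figures from the talk.

```python
# Rough energy budget for a 10-year node running off a 100 mAh cell.
# The 3.7 V nominal voltage is my assumption, not a number from the keynote.
capacity_mah = 100
voltage = 3.7
hours = 10 * 365.25 * 24                     # ~87,660 hours in 10 years

energy_wh = capacity_mah / 1000 * voltage    # ~0.37 Wh stored in the cell
avg_power_uw = energy_wh / hours * 1e6       # ~4.2 uW naive average budget
print(f"Naive average power budget: {avg_power_uw:.1f} uW")

# Self-discharge, conversion losses and peak-current derating eat most of
# that margin over a decade, leaving roughly a microwatt of usable average
# power -- far below what today's always-on radios and MCUs draw.
```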

The problem is that battery technology has been scaling at about 2x every 10 years compared to semiconductor technology, which scales 2x every 18 months. The gap between what portable electronic devices demand and what batteries can deliver will continue to grow. Don’t expect much improvement from the battery camp any time soon. “The energy density of lithium-ion batteries is so high that they’re really like small hand grenades,” said Amerasekera. There isn’t much left on the atomic scale that has a higher energy density and isn’t radioactive.
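
A quick calculation using the doubling periods quoted above shows how fast that gap compounds over a decade.

```python
# Compound the doubling periods quoted above over one decade.
years = 10
silicon_gain = 2 ** (years * 12 / 18)   # 2x every 18 months -> ~100x
battery_gain = 2 ** (years / 10)        # 2x every 10 years  ->   2x
print(f"Silicon over {years} years:   ~{silicon_gain:.0f}x")
print(f"Batteries over {years} years: ~{battery_gain:.0f}x")
print(f"The gap widens by:          ~{silicon_gain / battery_gain:.0f}x")
```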

How Do You Manage?

Lacking more capable batteries, silicon performance advances require power management. A lot of very effective techniques have been developed over the last several years. At 65 nm leakage power was reduced 300x vs. what it had been at 90 nm through a combination of SDRAM retention, logic power gating, channel length reduction, logic retention, process/temperature AVS and dynamic voltage and frequency scaling (DVFS). At 45 nm new techniques were devised—including adaptive body bias (ABB) and Retention ‘Til Access (RTA)—that resulted in a 1000x reduction in active power. Still, Amerasekera—like Jan Rabaey in his keynote yesterday—is concerned that we’ve run out of tricks at the component level that will scale.
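
For readers who haven’t run into DVFS, here’s a minimal sketch of the idea; the operating points and the C·V²·f power model are textbook simplifications of my own, not TI’s implementation. Dynamic power scales roughly with V²·f, so running just fast enough to meet a deadline and dropping the voltage along with the frequency saves energy.

```python
# Minimal DVFS illustration: pick the lowest frequency/voltage pair that
# still meets the workload's deadline. Operating points are made up.
OPERATING_POINTS = [     # (frequency in MHz, core voltage in V)
    (200, 0.8),
    (500, 0.9),
    (1000, 1.1),
]

def choose_point(cycles, deadline_ms):
    """Return the slowest (lowest-voltage) point that meets the deadline."""
    for freq_mhz, volts in OPERATING_POINTS:        # ordered slow/low-V first
        runtime_ms = cycles / (freq_mhz * 1e3)      # 1 MHz = 1e3 cycles per ms
        if runtime_ms <= deadline_ms:
            return freq_mhz, volts
    return OPERATING_POINTS[-1]                     # can't meet it; run flat out

def relative_energy(cycles, volts):
    """Dynamic energy ~ C * V^2 * cycles; frequency cancels out of energy."""
    return volts ** 2 * cycles                      # arbitrary units

freq, volts = choose_point(cycles=2_000_000, deadline_ms=10)
print(freq, volts)                                  # -> 200 0.8 (fits in 10 ms)
print(relative_energy(2_000_000, volts) /
      relative_energy(2_000_000, 1.1))              # ~0.53: roughly half the energy
```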

According to Amerasekera, future advances in ultra-low-power electronics will come at the system level. He gives the example of running an FFT in software, which requires 28 µW. Running it in hardware requires only 1.6 µW, an 18x improvement. Dropping the core voltage yields a further 1.8x power savings, for a total improvement of 28x. The SoC running the FFT now draws less than 1 µA.

3D chip techniques have finally evolved to the point where they can help optimize bandwidth, power and area. Currently 3D means package-on-package (POP) or stacked die. FinFET technology now enables denser dies, and die-to-die interconnects—vias connecting disparate digital, analog and RF layers—are becoming…viable.

Renewable energy—wind, solar, hydro and heating systems—has tremendous potential, though all of these sources face economic as well as technical challenges. Energy harvesting also has a lot of potential, but the efficiency of such systems is quite low, as is the amount of energy they can deliver. Still, there’s a place for them going forward.

Divide and Conquer

Basically, the mobile internet will need a variety of energy sources:

  • Batteries for general functionality
  • Storage caps for high current functions
  • Energy scavenging for extended battery life
  • Wireless power sources for connection to the grid

The mobile internet will also require intelligent energy management and control, including

  • Highly efficient on-chip power processing
  • Control of energy sources and delivery
  • Management of power demand and access
  • Handling of unreliable energy sources (wind, solar, etc.)

The challenge for the next decade will be coming up with another 2-3 orders of magnitude of power reduction to meet the demands of an increasingly wireless world.

Engineers always enjoy working on interesting problems, and this one should stay interesting for years to come.

For Energy Efficiency, Forget Accuracy

U.C. Berkeley professor Jan Rabaey kicked off the 2010 International Symposium on Low-Power Electronics and Design (ISLPED) today in Austin with a challenge to programmers, hardware designers and their EDA tool providers: The deterministic Turing model has hit the power wall. If you want energy efficiency, forget accuracy. Consider statistical, even analog computing.

In his keynote talk—“Going Beyond Turing: Energy Efficiency in the Post-Moore Era”—Rabaey claimed that despite a decade of advances in energy efficiency—including dynamic and adaptive voltage scaling, architectural innovations and other clever power management techniques—“waste has been eliminated and we’re basically running out of options” at smaller geometries. In fact at 22 nm and below, energy savings won’t scale any more. Leakage is now the major problem, and scaling makes it worse. ISSCC 2011 will feature a general session panel that will brainstorm how to get the next major power reduction as you scale down.

It’s Hard to Determine

All computers today are built on a deterministic model; with the same inputs you get the same outputs every time. But what if you don’t need complete accuracy—if “being in the ballpark” is good enough? According to Rabaey, “If you’re willing to back away a bit from accuracy, you can gain quite a bit in efficiency.” You can get along with a lot less computing power if you’re willing to accept a range of outputs.

Why not, in fact, do the computation in analog? The outcome will be not a single number but a distribution. Analog handles simple computations well, but improving accuracy is expensive in terms of energy, since in analog circuits there is an exponential relationship between the signal-to-noise ratio (SNR) and power. For up to about a 30 dB SNR, analog does a good job; to get better SNR, the power requirement goes up fast. There are a lot of applications that do well with a low SNR. While not exactly an application, the basically analog human brain works very well at a low SNR. Add a room full of screaming 7-year-olds and its efficiency falls off a cliff.
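
The textbook version of that tradeoff, in my own simplified model rather than anything from the talk: a noise-limited analog stage needs power roughly proportional to the SNR expressed as a ratio, so every extra 10 dB costs about 10x the power, while a digital datapath pays a roughly fixed increment per additional bit (about 6 dB).

```python
# Simplified energy models: analog cost grows exponentially with SNR in dB,
# digital cost grows roughly linearly with bits of precision. Constants are
# arbitrary, chosen so the crossover lands near the 30 dB figure Rabaey cites.
def analog_cost(snr_db):
    return 10 ** (snr_db / 10)     # power ~ SNR as a ratio

def digital_cost(snr_db):
    bits = snr_db / 6.02           # ~6 dB per bit
    return 200 * bits              # ~constant energy per extra bit

for snr_db in (20, 30, 40, 60):
    a, d = analog_cost(snr_db), digital_cost(snr_db)
    print(f"{snr_db} dB: analog {a:>9.0f}  digital {d:>5.0f}  "
          f"-> {'analog' if a < d else 'digital'} wins")
```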

Statistical computing doesn’t rely on probabilistic algorithms and it’s not the same as Boolean networks. Its inputs are deterministic or stochastic variables and its outputs are a range of numbers that follow a distribution curve. It relies on algorithms that display resilience in the presence of uncertainty (“noise”) and can still make reasonable estimates—with varying degrees of certainty—within parameters determined by the degree of uncertainty associated with the inputs.
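
A trivial sketch of that flavor of computation (my own toy example, not ERSA or anything shown in the keynote): feed uncertain inputs through a slightly unreliable operator many times and report a distribution rather than a single number.

```python
# Toy "statistical computing": uncertain inputs, occasionally faulty hardware,
# and an answer that is a distribution rather than a single number.
import random
import statistics

def noisy_multiply(a, b, error_rate=0.01):
    """A multiplier that occasionally gets the low-order bits wrong."""
    result = a * b
    if random.random() < error_rate:
        result += random.randint(-3, 3)   # small, bounded error
    return result

samples = []
for _ in range(10_000):
    a = random.gauss(100, 2)              # uncertain sensor reading
    b = random.gauss(7, 0.1)              # uncertain coefficient
    samples.append(noisy_multiply(a, b))

print(f"mean  = {statistics.mean(samples):.1f}")   # ~700
print(f"stdev = {statistics.stdev(samples):.1f}")  # spread of the estimate
# The caller gets an estimate plus a confidence band, which is often all an
# application such as a video pipeline actually needs.
```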

ERSA and ANTs

Rabaey cited work done at Stanford on the Error Resistant System Architecture (ERSA), an attempt to define a hardware/software architecture that supports statistical computing. ERSA has resulted in significant power savings in dealing with streaming video, adding little detectable noise in the process. Rabaey also cited experiments on algorithmic noise tolerance (ANT) that resulted in a 2.5x energy saving for a barely detectable increase in error rate.
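
The general ANT idea, rendered as a toy (an illustration of the concept, not the cited experiments): run the main datapath aggressively enough that it occasionally errs, have a cheap reduced-precision replica produce a rough estimate, and fall back to the estimate whenever the two disagree by more than the replica’s own error bound.

```python
# Toy algorithmic noise tolerance (ANT): a cheap estimator catches the rare
# large errors of an aggressively voltage-scaled main block.
import random

def main_block(x):
    """'Exact' datapath, but overscaled so it occasionally glitches."""
    y = 3 * x + 7
    if random.random() < 0.02:                 # rare timing error
        y += random.choice([-1, 1]) * 1000     # large-magnitude glitch
    return y

def estimator(x):
    """Reduced-precision replica: always within 21 of the exact answer."""
    return 3 * (x // 8) * 8 + 7                # drops the low-order bits of x

def ant(x, threshold=24):
    y, y_est = main_block(x), estimator(x)
    return y if abs(y - y_est) <= threshold else y_est

residual = sum(abs(ant(x) - (3 * x + 7)) for x in range(10_000))
print(f"Total residual error over 10,000 inputs: {residual}")  # small and bounded
```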

Rabaey pointed out that some applications are well suited to statistical computing techniques while others are not. Adding a bit of noise to streaming video is a reasonable tradeoff if it results in a major power saving. In contrast, medical, military and any mission-critical computing tasks need to remain deterministic. “Some errors roll off smoothly, while some are catastrophic.” And in any application, changes to the least significant bit (LSB) in a byte may be insignificant, while changes to the most significant bit (MSB) clearly are not.
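
The LSB/MSB point in three lines, for an 8-bit pixel value:

```python
pixel = 0b1011_0110            # 182, an 8-bit luma sample
print(pixel ^ 0b0000_0001)     # LSB flipped: 183 -- imperceptible on screen
print(pixel ^ 0b1000_0000)     # MSB flipped:  54 -- a glaring artifact
```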

Statistical computing may hold great promise, but designers and programmers need to change their thinking and EDA vendors need to deliver the tools. “We must start building statistics into all levels of the design process,” Rabaey concluded. “We have to break determinism. VHDL and Verilog are purely deterministic languages. We must spend time on error modeling of our hardware.” For this to happen, “The EDA community really needs to break out into the application space.”

Mentor, Synopsys, Cadence: the ball’s in your court, folks.

Is Cell Phone Radiation a Health Hazard?

Everyone with a microwave oven knows that radio waves can heat up water molecules, which is handy when you’re making Mac and Cheese for the kids but a little more problematic when you spend hours every day with a cell phone pressed to your ear. Are cell phones a health hazard?

In May the $24 million Interphone study was finally released, after 10 years of research in 13 countries with 13,000 participants. The study included 2,708 cases of glioma and 2,409 cases of meningioma, another type of brain tumor, along with over 5,634 controls. Eligible cases were patients diagnosed between 2000 and 2004. (Meningioma was not linked to cell phone use.) It is the largest study of cell phones and tumors ever done.

The point of the study was to determine once and for all whether cell phones represent a health hazard.

Mentor Passes Cadence in EDA Market

Gary Smith EDA has just released its 2009 market share study of the EDA market, compiled by analysts Nancy Wu and Mary Olsson. Synopsys continues to hold first place, but for the first time Mentor Graphics managed to edge out Cadence for the #2 slot:

[Chart: 2009 EDA market share]

Smith told Low-Power Design that the numbers reflect strictly tool sales and exclude business in peripheral markets in order to give the most accurate picture of EDA market share.

According to the report,

The biggest change in 2009 was Mentor passing Cadence to become number two in product sales in EDA.  This is an indication of the market shift caused by the move into the ESL Methodology.  Synopsys remains a strong number one.  We believe that the recent changes at Cadence have stopped their market share decline, similar to the changes made at Mentor, bringing in Wally Rhines during the switch to the RTL design methodology.

While I can hear the champagne corks popping in Wilsonville as far away as Austin, I’m sure that, just as in the old Avis ads, Mentor’s new motto will be “We’re #2, so we try harder!” With Cadence regaining traction, this will be a real horse race, and one from which all of us can only benefit.

The Essential Embedded Processing Tool

Today Embedded Insights guru Robert Cravotta launched the Embedded Processing Directory. I’m in awe at how much work obviously went into creating this 191-page detailed compendium. I don’t know what Robert’s business plan is, but free is a great price for such a valuable resource.

And quite a resource it is. The Directory lists detailed information about processors and cores from over 80 companies, with 23 columns of information about each processor, making it easy to compare based on almost any criterion you choose. You can download “all processors by company” as I did and browse the whole directory, or you can download 25 different reports that sort and filter the data in different ways, including target application, instruction set architecture and device architecture, size and type. If that isn’t granular enough, you can click on a device or part name to see supplemental information for that part. Still not satisfied? Click on the company’s name to search their site.
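
Once you’ve downloaded the “all processors by company” data, slicing it yourself is straightforward. Here’s a sketch; the file name and column headers are placeholders of mine, not the directory’s actual schema, so adjust them to match the real download.

```python
# Sketch of filtering the downloaded directory. The CSV file name and the
# column names are placeholders, not the directory's real schema.
import csv

with open("embedded_processing_directory.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# e.g., every part aimed at motor control that includes a floating-point unit
picks = [
    r for r in rows
    if "motor control" in r.get("Target Applications", "").lower()
    and "fpu" in r.get("Device Architecture", "").lower()
]

for r in picks:
    print(r.get("Company"), r.get("Part Name"))
```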

You’re no doubt familiar with several major embedded processors, but there are hundreds of others out there that may be better suited to your next design. The Embedded Processing Directory can save you endless hours of digging through data sheets, making an informed decision a lot faster and easier.

What’s not to like? Download it today from the Embedded Insights site.
