Q & A

  1. A New Picture of Flash Reliability

    How do you assess flash reliability? The largest study of flash drives in production — six years, 10 device models, three types of flash, millions of drive days — offers new insight.

    The best way to examine the reliability of a technology, one could argue, is to look at how the technology actually performs in real-world production, rather than, say, in a lab experiment under controlled conditions, using a small number of drives and synthetic workloads. That was the premise of a research study on the reliability of flash drives, titled "Flash Reliability in Production: The Expected and the Unexpected," presented at FAST '16, the 14th USENIX Conference on File and Storage Technologies, held February 22–25, 2016 in Santa Clara, California.
    Covering a six-year period and millions of drive days, the study looks at reliability data from 10 device models with three types of flash technologies (MLC, eMLC, and SLC) and with feature sizes ranging from 24nm to
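The study's exposure metric, drive days, is simple to work with: one drive observed for one day contributes one drive day, and dividing a failure count by total drive days gives a per-day rate that can be scaled to a yearly figure. A minimal sketch, using made-up numbers rather than figures from the study:

```python
# Illustrative only: deriving an annualized failure rate from "drive days".
# The inputs below are invented examples, not results from the FAST '16 study.

def annual_failure_rate(failures, drive_days):
    """Failures per drive-year: scale the per-drive-day rate to a 365-day year."""
    return failures / drive_days * 365

# e.g., 500 failures observed over 2 million drive days
rate = annual_failure_rate(500, 2_000_000)  # 0.09125, i.e., ~9.1% per drive-year
```

This is why "millions of drive days" matters: with that much exposure, even low per-year failure rates produce enough failure events to compare drive models meaningfully.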

    Read more »
  2. What Intel Is Saying with Skylake

    It is not a question of if, but when, your industrial or embedded computing solution will leverage Intel's sixth-generation Core technology to stay relevant.
    Whenever Intel (or any chip maker) introduces a new generation of products, the vendor naturally lists all the improvements in performance, features, feature density, and power consumption as reasons to upgrade. What was particularly noteworthy about Intel's August introduction of Skylake, though, was that Intel deliberately drew those comparisons against Westmere chips from 2010, not the Haswell CPUs you might have bought last year. The fact is that there are a billion five-year-old PCs in use today. One reason: many embedded/industrial applications specify a PC built only from an approved list of components, and for companies to go through the hassle and expense of a solution redesign and re-approval, product improvements can't just be incremental.
    That is the case Intel is making now. For example, the Skyla

    Read more »
  3. What the Heck is a Quark?

    Many people have likely never heard of the Intel Quark unless they work with industrial or embedded computer systems. So what, you might ask, is a Quark? Intel designed the Quark specifically to meet the demands of the Internet of Things (IoT). In short, IoT is an environment in which physical objects can send and receive pertinent information across a network without the involvement of a human being. The Intel Quark operates as a gateway for this information as it travels to the cloud.
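That gateway role, collecting readings from devices and batching them for the cloud, can be sketched in a few lines. The sensor names, payload shape, and helper functions below are hypothetical illustrations, not Intel or Advantech APIs:

```python
# Sketch of an IoT gateway's job: package device readings with metadata,
# then batch them into one payload for upload to a cloud service.
# All names and fields here are invented for illustration.
import json
import time

def collect_reading(sensor_id, value):
    """Wrap a raw sensor value with the metadata the cloud side needs."""
    return {"sensor": sensor_id, "value": value, "ts": time.time()}

def batch_for_upload(readings):
    """Serialize a batch of readings as the JSON payload for one upload."""
    return json.dumps({"count": len(readings), "readings": readings})

readings = [collect_reading("temp-01", 21.5), collect_reading("temp-02", 22.1)]
payload = batch_for_upload(readings)
```

A real gateway would speak actual device protocols (serial, Modbus, ZigBee, and so on) on one side and a real cloud API on the other; the batching idea is the same.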
    Advantech’s new UNO-1252G uses Intel’s Quark SoC x1000 Series chipset, a 32-bit x86 System on a Chip “designed to bring all the technology that we’ve had for the past 20 years into today’s applications,” says Peter Dice, a software engineer at Intel. The Quark uses single-core, single-threaded technology and is built on the smallest core Intel currently offers. The sma

    Read more »