The cloud and big data are presenting the engineering community with a variety of opportunities in the design, test, and use of cloud computing systems. Challenges range from implementing low-power servers that are central to the cloud to making sense of the terabytes of data it stores.
Building the Cloud
Power consumption is a critical specification in cloud implementations. A recent article1 from The New York Times reported that yearly leases top out at about $150 per square foot for a trophy high-rise in Manhattan; they reach four times that for space in “low, bland buildings across the Hudson River in New Jersey.” That’s because the New Jersey buildings have sufficient electric capacity to power the data centers associated with the stock exchanges, banks, and high-velocity traders. The Times reported that the owners of the New Jersey buildings have evolved from landlords into power brokers: “Electrical capacity is often the central element of lease agreements, and space is secondary.”
The growing power consumption indicates the need for improved server efficiencies through technologies like Hewlett-Packard’s Moonshot. According to HP, “If the public cloud were a country, it would rank fifth in electricity consumption.”
As HP president and CEO Meg Whitman put it, “With nearly 10 billion devices connected to the Internet and predictions for exponential growth, we’ve reached a point where the space, power, and cost demands of traditional technology are no longer sustainable. HP Moonshot marks the beginning of a new style of IT that will change the infrastructure economics and lay the foundation for the next 20 billion devices.”
The HP Moonshot system is built around what the company calls “the world’s first software-defined server.” The servers are designed to optimize performance for specific workloads, and they share management, power, cooling, networking, and storage. In early April, HP unveiled its first commercially available HP Moonshot system, which, the company said, uses up to 89% less energy, occupies 80% less space, and costs 77% less than traditional x86-based servers.
The new HP Moonshot system consists of the HP Moonshot 1500 enclosure and application-optimized HP ProLiant Moonshot servers (Figure 1). The ProLiant Moonshot server is available with the Intel Atom S1200 processor and supports web-hosting workloads.
Figure 1. ProLiant Moonshot Server
Although the first server uses an Intel chip, the company will work with other chip vendors as well through the HP Pathfinder Innovation Ecosystem. Participants include AMD, AppliedMicro, Calxeda, Intel, and Texas Instruments. In addition, Lakshmi Mandyam, director of Server Systems & Ecosystem at ARM, wrote, “HP’s Moonshot system and the HP Pathfinder Innovation Ecosystem announcements are closely aligned with ARM’s philosophy and approach to solving technology and business challenges. For the past four years, ARM has been working on bringing the low power leadership we have demonstrated in the mobile and consumer markets into the server space.”2
TI recently announced its intention to bring ARM technology to the Moonshot project through application of its KeyStone II-based multicore SoCs. TI’s KeyStone II-based SoCs integrate fixed- and floating-point TMS320C66x DSP cores with multiple ARM Cortex-A15 MPCore processors, packet processing, security processing, and Ethernet switching.
TI said the new SoCs offer customers more than four times the capacity and performance at the same power relative to existing x86 devices, specifically for voice-transcoding applications. TI attributes the performance in part to the C-programmable floating-point C66x DSP cores.
“The scalability and high performance, coupled with the low power requirements of the HP Moonshot System, enable customers to develop solutions that address ever-changing and demanding market needs in the high-performance computing, cloud computing, and communications infrastructure markets,” said Brian Glinsman, a vice president at TI, in a press release. He said the company’s SoCs deliver the necessary performance and a low-power envelope that customers require.
Testing the Cloud
The capability to test the cloud and related infrastructure advanced in May when Spirent Communications announced its next-generation high-speed Ethernet test systems, which include new high-density test modules and a high-performance chassis that doubles port density while lowering power consumption. Specifically, the Spirent dX2 is an eight-port 40GbE/32-port 10GbE dual-speed test module in a single-slot configuration, supporting up to 96 ports of 40GbE on a single chassis (Figure 2). The dX2 series combines 40GbE and 10GbE functionality in several density and speed form factors to provide a flexible system for testing high-performance, low-latency top-of-rack switches, end-of-row switches, and data center fabrics spanning hundreds or thousands of ports.
Figure 2. dX2 Test Module
In addition, the Spirent fX2 40/10GbE dual-speed test modules combine Spirent’s layer 2-7 traffic generation and analysis functionality with scalable network emulation. The fX2 supports up to five 40GbE and 20 10GbE ports per slot, making it suitable for functional, conformance, and performance testing of service provider, data center, SDN, or cloud infrastructure environments.
The company also announced that its Spirent fX and Spirent mX two-port single-slot 100GbE test modules now support CFP2 optical transceivers. Targeting the testing of high-density service-provider core routers and high-speed Ethernet cloud infrastructure, the fX 100GbE module validates data-plane QoS performance over realistic cloud infrastructure topologies. The mX 100GbE module, featuring Spirent’s Cloud Core, a patent-pending technology designed to add elastic computing to the Spirent Layer 2-7 performance software platform, is suitable for testing the most complex service provider edge/core routers and high-performance networks.
And finally, Spirent added the SPT-N11U chassis to support its lineup of high-speed test modules. The SPT-N11U doubles port density and offers an Intelligent Power Control function that lowers the power consumption of high-scale test beds by up to 60%, a four-fold improvement in boot time, and a 400GbE-ready architecture.
“Driven by the proliferation of mobile devices and cloud computing, carrier and data center networks are becoming increasingly complex and are forced to handle exploding levels of data traffic,” said Ahmed Murad, vice president of product marketing and management at Spirent, in a press release. “Spirent is committed to developing test solutions that validate these complex networks and their underlying technologies—with realism, at high scale, and at high density.”
Using the Cloud
A key application of the cloud for the engineering community will be storing data derived from the physical world, which National Instruments referred to as “Big Analog Data” in its Data Acquisition Technology Outlook 2013.3 That data results from myriad sources, including vibration, temperature, pressure, sound, light, magnetism, and voltage measurements as well as image and RF signal acquisition.
NI noted that with the arrival of big data, the key is not simply to collect the most data but to make sense of the data collected. That will require contextual data mining, or the use of metadata stored along with the raw data, to help manipulate stored data and locate points of interest within it.
NI presented a simple example: “…examine a series of seemingly random integers: 5126838937. At first glance, it is impossible to make sense of this raw information. However, when given context—(512) 683-8937—the data is much easier to recognize and interpret as a phone number.”
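NI’s phone-number analogy can be made concrete in a few lines: the same digits become readable once structural context is applied. A minimal Python illustration, using the formatting convention from the article’s example:

```python
# The raw integers carry no obvious meaning on their own.
raw = "5126838937"

# Applying context (North American phone-number formatting) makes the
# same data immediately recognizable and interpretable.
formatted = f"({raw[:3]}) {raw[3:6]}-{raw[6:]}"
print(formatted)  # → (512) 683-8937
```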
Descriptive information in a measurement system could include details such as sensor type, manufacturer, or calibration date for a given measurement channel, or it could provide information about the overall component under test. “In fact,” NI noted, “the more context that is stored with raw data, the more effectively that data can be traced throughout the design life cycle, searched for or located, and correlated with other measurements in the future by dedicated data post-processing software.” In addition, you can help ensure better use of your collected data by incorporating intelligent DAQ nodes that can help make decisions on what data to collect and when.
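The approach NI describes, storing contextual metadata alongside raw samples so post-processing software can later search and correlate measurements, can be sketched as follows. This is a minimal illustration, not NI’s API; the `Measurement` record and `find_by` helper are hypothetical names chosen for the example:

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    """Raw samples plus the contextual metadata NI recommends storing."""
    samples: list           # raw channel data
    sensor_type: str        # e.g. thermocouple, accelerometer
    manufacturer: str
    calibration_date: str   # ISO date of last calibration

def find_by(measurements, **criteria):
    """Locate measurements whose metadata matches every given criterion,
    the kind of search that metadata makes possible on stored data."""
    return [m for m in measurements
            if all(getattr(m, key) == value for key, value in criteria.items())]

# Two channels acquired with different sensors (illustrative values).
readings = [
    Measurement([22.1, 22.3, 22.2], "thermocouple", "Acme", "2013-01-15"),
    Measurement([0.02, 0.04, 0.03], "accelerometer", "Acme", "2012-11-02"),
]

# Later, post-processing software can locate points of interest by context.
hits = find_by(readings, sensor_type="thermocouple")
```

Without the attached metadata, the two lists of floats above would be indistinguishable; with it, the data can be traced, filtered, and correlated long after acquisition.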
In a sidebar4 within Data Acquisition Technology Outlook 2013, Matt Wood, senior manager and principal data scientist at Amazon Web Services, wrote, “The unification of DAQ hardware and onboard intelligence has enabled systems to be increasingly embedded or remote and, in many industries, has paved the way for entirely new applications.” With the advent of the Internet of Things, the physical world is becoming embedded with intelligence. The ability to process and analyze the resulting data sets, he wrote, will have profound effects across a massive array of industries, including healthcare and fitness, energy generation, transportation, building automation, and insurance.
And the cloud’s function won’t be limited to storage. “The near infinite computing resources in the cloud provide an opportunity for software to offload computationally heavy tasks,” Wood concluded.
1. Glanz, J., “The Cloud Factories: Landlords Double as Energy Brokers,” The New York Times, May 13, 2013.
2. Mandyam, L., “Moonshot—a shot in the ARM for the 21st century data center,” Smart Connected Devices ARM Blog, April 9, 2013.
3. “Big Analog Data and Data Acquisition,” Data Acquisition Technology Outlook 2013, National Instruments, 2013, p. 4.
4. Wood, M., “The Rise of Cloud Storage and Computing,” Data Acquisition Technology Outlook 2013, National Instruments, 2013, p. 7.