Time to get ‘Edge-ucated’
The term ‘Edge Computing’ is used by many different companies to describe many different environments, each bending the definition to sell its own version of the truth. There is nothing wrong with that, except that it makes things very confusing. Edge Computing is actually quite simple: think of it as moving data processing power closer to where the data is generated – closer to the source.
This obviously runs counter to what virtually all of the industry has been developing and building over the past ten years – Cloud Computing: send all the data to the Cloud, use its ‘infinite’ processing power to process and analyze it, then return the results to the user. There is nothing wrong with Cloud Computing until you try to break the laws of physics – bandwidth has finite limits, and the economics of sending massive amounts of data can be cost prohibitive. And even if you could get all the data to the Cloud, you would not be able to process and analyze it in real time, which is becoming necessary in more and more industries. So the current models of Cloud Computing are starting to break down – enter Edge Computing!
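To make the bandwidth constraint concrete, a back-of-the-envelope calculation helps. The sketch below uses purely hypothetical figures (100 TB of daily sensor data over a dedicated 1 Gbps WAN link), not numbers from any specific deployment:

```python
def transfer_time_hours(data_bytes, link_bits_per_sec):
    """Time to move `data_bytes` over a link, ignoring protocol overhead."""
    return data_bytes * 8 / link_bits_per_sec / 3600

# Hypothetical example: shipping one day's 100 TB of sensor data
# to the cloud over a dedicated 1 Gbps WAN link.
hours = transfer_time_hours(100e12, 1e9)  # roughly 222 hours, over nine days
```

At more than nine days per day of data, the pipe never catches up, which is exactly the physics problem the paragraph above describes.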
Edge Computing is poised to resolve the issues now starting to impact Cloud Computing. The Edge is the optimal place to ingest big data fast and store a lot of it – hundreds of terabytes – while simultaneously processing it, for example by comparing the most recently ingested data against the stored data set to detect anomalies in real time. It is now practical to simultaneously run forensic analysis against the entire data set in place, to explore whether detected anomalies represent a current threat, all while continuing to ingest data without slowing down or dropping anything. Requirements like this arise in applications such as cybersecurity, detecting fraudulent or erroneous activity in financial markets, and identifying that something is going awry in critical infrastructure or industrial processes. For insights and actions to come soon enough to be truly useful, big data must often be ingested, stored, and processed right there at the Edge where it is created.
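The "compare fresh data against the stored set" pattern described above can be sketched in miniature. This is an illustrative example only, using a simple rolling z-score test over a numeric stream; a real Edge system would use far more sophisticated detectors and data types:

```python
from collections import deque
import math

class StreamingAnomalyDetector:
    """Compare each newly ingested value against a rolling window of
    recent history and flag statistical outliers (simple z-score test)."""

    def __init__(self, window=1000, threshold=3.0):
        self.history = deque(maxlen=window)  # bounded 'stored data set'
        self.threshold = threshold

    def ingest(self, value):
        """Return True if `value` is anomalous relative to stored history."""
        anomaly = False
        if len(self.history) >= 30:  # need enough history to be meaningful
            mean = sum(self.history) / len(self.history)
            var = sum((x - mean) ** 2 for x in self.history) / len(self.history)
            std = math.sqrt(var)
            if std > 0 and abs(value - mean) / std > self.threshold:
                anomaly = True
        self.history.append(value)  # keep ingesting regardless of the verdict
        return anomaly
```

Note that ingestion never pauses for analysis, mirroring the requirement above that the system keeps capturing without slowing down or dropping anything.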
Packet Capture and Streaming Analytics
Axellio’s packet capture and analysis solution integrates advanced FPGA-accelerated network interfaces and PTP timing cards with the latest x86 architecture and next-generation NVMe-on-PCIe-Fabric solid-state storage (SSD). By leveraging X-IO’s unique FabricXpress™ Architecture, the result is a powerful multi-use solution offering simultaneous packet capture and persistence, network analytics, alerting, and the ability to run real-time reporting even during the heaviest traffic periods.
With almost 500 TB of dual-ported high-speed NVMe SSDs, the capture solution can support multiple 40 Gbps or 100 Gbps interfaces with internal data rates of up to 60 gigabytes per second (full duplex). X-IO’s Axellio Edge Computing System combines the latest advances in NVMe solid-state storage technology with FPGA-accelerated network capture and timing cards to form a solution designed for the most demanding environments.
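Figures like these translate directly into capture retention windows. As a rough sketch using the numbers above (about 500 TB of capacity and the 60 GB/s peak internal rate; the 12.5 GB/s figure is just the standard byte-rate conversion of a 100 Gbps interface, not a vendor specification):

```python
def retention_hours(capacity_bytes, ingest_bytes_per_sec):
    """How long capture can run before storage fills at a sustained rate."""
    return capacity_bytes / ingest_bytes_per_sec / 3600

# At the absolute peak internal rate of 60 GB/s:
peak = retention_hours(500e12, 60e9)    # about 2.3 hours of capture
# A single 100 Gbps interface at line rate is 12.5 GB/s:
line = retention_hours(500e12, 12.5e9)  # about 11.1 hours of capture
```

In practice real traffic rarely sustains line rate around the clock, so actual retention windows would be longer; the point is that the arithmetic is simple and worth doing for any capture deployment.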
“Ascolta’s enterprise clients require high-performance edge computing, and with Axellio we meet the requirements of the most demanding environments. Our customers across federal, state and local government and commercial markets need to access information in real time and respond to high volumes of data rapidly. Through this offering, we give customers this level of access, thereby enabling them to make critical decisions more confidently.”
— Senior Vice President of Strategy, Ascolta
Real-Time Big Data Analytics
Real-time big data analytics enables organizations to extract immense value from data in motion in ways that traditional approaches cannot. Complex analytics allows companies to run event processing against massive volumes of data streaming into the enterprise at high velocity. Data streams can hold immense value for the organization, but the challenge of handling such large amounts of real-time I/O has forced traditional IT infrastructures to compromise by separating ingest and analytic operations, increasing costs for hardware and software.
Axellio Edge Computing Systems enable higher volumes of data to be processed with a fraction of the hardware required for traditional compute approaches. Axellio empowers complex analytics on massive-scale streaming data and delivers a flexible environment for turning this data into valuable assets.
“Thanks to Axellio, we are able to accelerate the analytics cycle of critical business decision making by nearly 400 percent. Because we can accelerate the decision-making cycle, we enable our customers to answer more questions, make more decisions, and positively affect critical business outcomes. With Axellio, we are able to discover and exploit unknown relationships, identify key business drivers, generate predictions, and postulate hypotheses that help drive effective, efficient decision-making. Axellio helps us provide our customers deeper, richer insight, in a shorter amount of time, than we could using traditional infrastructure solutions.”