If you’re looking for faster WiFi you want 802.11ac — it’s as simple as that. In essence, 802.11ac is a supercharged version of 802.11n (the current WiFi standard that your smartphone and laptop probably use), offering link speeds ranging from 433 megabits-per-second (Mbps), all the way through to several gigabits per second. To achieve speeds that are dozens of times faster than 802.11n, 802.11ac works exclusively in the 5GHz band, uses a ton of bandwidth (80 or 160MHz), operates in up to eight spatial streams (MIMO), and employs a kind of technology called beamforming. For more details on what 802.11ac is, and how it will eventually replace wired gigabit Ethernet networking at home and in the office, read on.
How 802.11ac works
Years ago, 802.11n introduced some exciting technologies that brought massive speed boosts over 802.11b and g. 802.11ac does something similar compared with 802.11n. For example, whereas 802.11n had support for four spatial streams (4×4 MIMO) and a channel width of 40MHz, 802.11ac can utilize eight spatial streams and has channels up to 80MHz wide — which can then be combined to make 160MHz channels. Even if everything else remained the same (and it doesn’t), this means 802.11ac has 8x160MHz of spectral bandwidth to play with, vs. 4x40MHz — a huge difference that allows it to squeeze vast amounts of data across the airwaves.
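To put that gap in concrete terms, here is a quick back-of-the-envelope calculation (a Python sketch, purely for illustration) that multiplies streams by channel width for each standard using the maximums quoted above. It counts raw per-stream bandwidth only, not real-world throughput.

```python
# Back-of-the-envelope comparison of the figures quoted above:
# 802.11n at 4 streams x 40MHz vs. 802.11ac at 8 streams x 160MHz.
n_streams_11n, channel_mhz_11n = 4, 40
n_streams_11ac, channel_mhz_11ac = 8, 160

total_11n = n_streams_11n * channel_mhz_11n      # 160 MHz
total_11ac = n_streams_11ac * channel_mhz_11ac   # 1,280 MHz

print(f"802.11n : {total_11n} MHz of stream-bandwidth")
print(f"802.11ac: {total_11ac} MHz of stream-bandwidth")
print(f"Ratio   : {total_11ac / total_11n:.0f}x")  # 8x
```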
To boost throughput further, 802.11ac also introduces 256-QAM modulation (up from 64-QAM in 802.11n), which packs 256 distinct symbols onto the same carrier by varying its amplitude and phase. Moving from 64 to 256 constellation points raises the data carried per symbol from six bits to eight, which in theory boosts the spectral efficiency of 802.11ac by around 33% over 802.11n. Spectral efficiency is a measure of how well a given wireless protocol or multiplexing technique uses the bandwidth available to it. In the 5GHz band, where channels are fairly wide (20MHz+), spectral efficiency isn't so important. In cellular bands, though, channels are often only 5MHz wide, which makes spectral efficiency very important.
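That 33% figure falls straight out of the constellation sizes: a 256-point constellation carries log2(256) = 8 bits per symbol, against log2(64) = 6 bits for 64-QAM. Here is that arithmetic as a small Python sketch, for illustration only.

```python
import math

# Bits carried per symbol = log2 of the number of constellation points.
bits_64qam = math.log2(64)    # 6 bits/symbol (802.11n)
bits_256qam = math.log2(256)  # 8 bits/symbol (802.11ac)

gain = bits_256qam / bits_64qam - 1
print(f"256-QAM carries {bits_256qam:.0f} bits/symbol vs {bits_64qam:.0f} bits/symbol, "
      f"about {gain:.0%} more data in the same channel")
```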
802.11ac also introduces standardized beamforming (802.11n had it, but it wasn't standardized, which made interoperability an issue). Beamforming is essentially transmitting radio signals in such a way that they're directed at a specific device. This can increase overall throughput and make it more consistent, as well as reduce power consumption. Beamforming can be done with smart antennae that physically move to track the device, or by modulating the amplitude and phase of the signals so that they destructively interfere with each other, leaving just a narrow, not-interfered-with beam. 802.11ac uses the second method, which can be implemented by both routers and mobile devices. Finally, 802.11ac, like 802.11 versions before it, is fully backwards compatible with 802.11n and 802.11g — so you can buy an 802.11ac router today, and it should work just fine with your older WiFi devices.
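Before moving on to range, here is a minimal, idealised sketch of the phase-shifting flavour of beamforming described above: a uniform linear array of antennas fed the same signal with per-antenna phase offsets chosen so the transmissions add up constructively toward one target angle and largely cancel elsewhere. The array size, spacing and angles are made-up illustrative values; real 802.11ac beamforming derives its weights from channel sounding with the client rather than from geometry alone.

```python
import numpy as np

def array_gain(steer_deg: float, look_deg: float, n_antennas: int = 4) -> float:
    """Relative power radiated toward look_deg when the array is steered toward
    steer_deg. Antennas are spaced half a wavelength apart (idealised model)."""
    n = np.arange(n_antennas)
    # Phase offset applied at each antenna to steer the beam.
    weights = np.exp(-1j * np.pi * n * np.sin(np.radians(steer_deg)))
    # Phase each antenna's signal accumulates toward the observation angle.
    response = np.exp(1j * np.pi * n * np.sin(np.radians(look_deg)))
    return abs(np.sum(weights * response)) ** 2 / n_antennas ** 2

print(f"Toward the device (30 deg): {array_gain(30, 30):.2f}")   # 1.00, fully constructive
print(f"Off to the side  (-20 deg): {array_gain(30, -20):.2f}")  # ~0.05, mostly cancelled
```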
The range of 802.11ac
In theory, on the 5GHz band and using beamforming, 802.11ac should have the same or better range than 802.11n (without beamforming). The 5GHz band doesn't have quite the same range as 2.4GHz (802.11b/g), because higher-frequency signals are attenuated more by walls and other obstacles. But that's the trade-off we have to make: there simply isn't enough spectral bandwidth in the massively overused 2.4GHz band to allow for 802.11ac's gigabit-level speeds. As long as your router is well-positioned, or you have multiple routers, it shouldn't matter a huge amount. As always, the more important factor will likely be the transmission power of your devices and the quality of their antennae.
How fast is 802.11ac?
And finally, the question everyone wants answered: just how fast is 802.11ac WiFi? As always, there are two answers: the theoretical max speed that can be achieved in the lab, and the practical maximum speed you'll most likely get at home in the real world, surrounded by lots of signal-attenuating obstacles.
The theoretical max speed of 802.11ac comes from eight spatial streams, each using a 160MHz, 256-QAM channel capable of 866.7Mbps — a grand total of 6,933Mbps, or just shy of 7Gbps. That's a transfer rate of roughly 870 megabytes per second — more than you can squeeze down a SATA 3 link. In the real world, thanks to channel contention, you probably won't get more than two or three 160MHz channels, so the max speed comes down to somewhere between 1.7Gbps and 2.5Gbps. Compare this with 802.11n's max theoretical speed, which is 600Mbps.
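For reference, here is the arithmetic behind those headline numbers, using the per-stream figure quoted above (a quick Python sketch, nothing more).

```python
# Eight spatial streams, each topping out at 866.7Mbps (160MHz, 256-QAM).
streams = 8
per_stream_mbps = 866.7

total_mbps = streams * per_stream_mbps
print(f"Aggregate link rate: {total_mbps:,.1f} Mbps (~{total_mbps / 1000:.1f} Gbps)")  # ~6,933.6 Mbps
print(f"Transfer rate      : {total_mbps / 8:,.0f} MB/s")                              # ~867 MB/s
```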
The future of 802.11ac
802.11ac will only get faster, too. As we mentioned earlier, the theoretical max speed of 802.11ac is just shy of 7Gbps — and while you'll never hit that in a real-world scenario, we wouldn't be surprised to see link speeds of 2Gbps or more in the next few years. At 2Gbps you'd get a transfer rate of around 250MB/sec, at which point wired Ethernet starts to serve less and less purpose in the home. To reach such speeds, though, chipset and device makers will have to work out how to implement four or more 802.11ac streams, in both software and hardware.
Sharing sensitive information with the people who are supposed to have it, while keeping it from the people who aren't, has been one of the toughest problems facing white-collar workers for as long as anyone can remember. Since the 1960s, the concept of having a multilevel security (MLS) system in place to define the 'need to know' matrix for controlling the use of sensitive data has been considered a must-have. Under an MLS structure, people are classified into levels of clearance and information into levels of sensitivity. As a result, data classification schemes such as "Public", "Internal Use Only", "Confidential", "Secret", and "Top Secret", along with access to those levels restricted by clearance, have become the baseline of most world-class information security policies.
Under an MLS-based security policy, users must hold the right clearance before they are allowed to look at classified information. As an example, users with a "Confidential" clearance are authorized to see documents classified as "Confidential", but they can't see or use "Secret" or "Top Secret" information, just as an outsider without any clearance couldn't see any of it.
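As a rough illustration of that rule, here is a toy Python sketch using the classification levels mentioned above. The function and data structures are hypothetical, written for this example rather than taken from any particular product.

```python
# Clearance/classification levels, ordered from least to most sensitive.
LEVELS = ["Public", "Internal Use Only", "Confidential", "Secret", "Top Secret"]
RANK = {level: i for i, level in enumerate(LEVELS)}

def can_read(user_clearance: str, document_classification: str) -> bool:
    """A user may read a document only if their clearance is at least as high
    as the document's classification (the classic 'no read up' rule)."""
    return RANK[user_clearance] >= RANK[document_classification]

print(can_read("Confidential", "Confidential"))     # True
print(can_read("Confidential", "Secret"))           # False: no read up
print(can_read("Top Secret", "Internal Use Only"))  # True
```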
To make this paradigm accessible to virtually any organization, RightsWATCH delivers the complete data-centric information security spectrum: dynamically identifying sensitive or confidential information, classifying it into the right level (according to policy), marking and tagging it, and encrypting it with the world-class encryption technology in Microsoft's Azure Rights Management Services, so that only those with express authority to use that information can do so, all without requiring any user involvement. With RightsWATCH and Azure RMS, even if sensitive data is somehow leaked, it is unusable by any unauthorized parties into whose hands it may fall.
RightsWATCH enforces your custom MLS data classification model, granting access to data based on its level of sensitivity matched against a user's credentials, so that sensitive data really is handled on a "need-to-know" basis. With this approach, you can protect data privacy and achieve regulatory compliance. RightsWATCH also lets you grant or revoke each user's access across multiple security clearances at any given moment in time, or based on a specific role being performed.
Since each organization differs in its MLS definition and approach, RightsWATCH enables customized, granular definition of your MLS, classifying data not only into levels of sensitivity but also segmenting access by 'scope of reference', such as by department (HR, Finance, R&D, etc.) or by project (M&A, product launch, etc.). With RightsWATCH, the organization can define multiple dimensions of classification, such as "Scope" and "Level", to establish a rich, automated classification system. Users are then granted role-based access to the company, scope, and levels of information appropriate to their functions.
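To show how a second dimension changes the check, here is a small, hypothetical extension of the earlier sketch that pairs a scope with a level. The label structure, scope names and function are illustrative assumptions, not RightsWATCH's actual data model or API.

```python
from dataclasses import dataclass

LEVELS = ["Public", "Internal Use Only", "Confidential", "Secret", "Top Secret"]
RANK = {level: i for i, level in enumerate(LEVELS)}

@dataclass
class Label:
    scope: str   # e.g. a department ("HR", "Finance") or a project ("M&A")
    level: str   # one of LEVELS

def can_access(user_clearances: dict[str, str], doc: Label) -> bool:
    """The user needs a clearance for the document's scope, and that clearance
    must be at least as high as the document's level."""
    clearance = user_clearances.get(doc.scope)
    return clearance is not None and RANK[clearance] >= RANK[doc.level]

analyst = {"Finance": "Secret", "HR": "Internal Use Only"}
print(can_access(analyst, Label("Finance", "Confidential")))  # True
print(can_access(analyst, Label("HR", "Confidential")))       # False: level too low
print(can_access(analyst, Label("R&D", "Public")))            # False: no clearance for that scope
```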
Leveraging data classification and information rights management in a single solution, RightsWATCH keeps sensitive data safe and secure independent of its state: at rest, in motion, or in use. Further, sensitive data is always protected, even if it exists totally outside your ‘secure’ network perimeter.
OryxAlign has been chosen as the new technology support partner for Natural Balance Foods, the makers of Nakd & Trek bars. Natural Balance Foods are a British company devoted to increasing world happiness with yummy healthy snacks, humour and helpfulness. That means they make delicious, good-for-you munchies, do their best to help others, and try to spread a little joy along the way. Their range of healthy snacks and bars is a fantastic choice for vegans and for anyone who wants to look good, feel good and do good.
OryxAlign have been brought on board to work closely with the operations team at Natural Balance, advising on and facilitating the planned expansion into international markets as well as the continued development of the IT and application infrastructure. OryxAlign have a proven track record within the FMCG and retail sector, helping brands such as Gu, Ella's Kitchen, Divine Chocolate and Up&Go grow, develop and prosper.
I'm hugely excited by the opportunity of working with this great brand and taking their operation to the next level. Our technology services, platforms & consultancy are perfectly placed to develop a long-lasting partnership.
Carl Henriksen, OryxAlign Founder & CEO
As cloud continues to go mainstream, the question amongst progressive IT departments when rolling out new applications is shifting from "why cloud?" to "which cloud?" Currently, the public cloud is dwarfed by private and on-premise solutions, with VMware CEO Pat Gelsinger stating that the latter represents over 90 per cent of the total business. Furthermore, Gartner has estimated that by 2020, on-premise cloud will account for 70 per cent of the total market.
There are many good reasons why organisations still opt to host their applications on hardware owned and managed in-house. Most organisations still have large investments in technology, people, and processes that cannot simply be written off; certain workloads still do not suit virtualised or multi-tenanted platforms; renting resources is not always cheaper or better than owning them; and there are valid security and compliance reasons for keeping certain data on premise.
In spite of these concerns, however, the public cloud continues to grow at a ferocious rate, validating the benefits that this infrastructure delivery model offers. The fact that certain data and workloads are better suited to a private cloud infrastructure or to a physical hosted platform is therefore the caveat that opens the door to hybrid solutions. A hybrid solution gives organisations the option of scaling resources for specific workloads and running applications on the platform most appropriate to a given task: a highly dynamic application with varying spikes may be best supported in the public cloud; a performance-intensive application may be better suited to the private cloud; and applications with high regulatory requirements may need to reside on a physical platform. Furthermore, a hybrid solution allows an organisation to place its data where compliance or security requirements dictate. This is significant, as 59 per cent of UK IT professionals surveyed by NaviSite still cite security as their main concern with cloud migration. Once a business has decided on a hybrid model, however, there is still the task of 'getting there.'
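Reduced to its simplest form, that placement reasoning might look something like the sketch below. The attribute names, the ordering of the rules and the default are illustrative assumptions; a real placement decision would weigh many more factors.

```python
def place_workload(regulated: bool, performance_sensitive: bool, bursty: bool) -> str:
    """A toy rule of thumb for matching a workload to a platform in a hybrid estate."""
    if regulated:
        return "physical platform"   # compliance dictates dedicated kit
    if performance_sensitive:
        return "private cloud"       # predictable, performance-intensive work
    if bursty:
        return "public cloud"        # spiky demand benefits from elastic scale
    return "private cloud"           # a conservative default

print(place_workload(regulated=True, performance_sensitive=False, bursty=False))  # physical platform
print(place_workload(regulated=False, performance_sensitive=True, bursty=True))   # private cloud
print(place_workload(regulated=False, performance_sensitive=False, bursty=True))  # public cloud
```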
With a hybrid solution, it's important to audit the systems already in place and to optimise the hybrid configuration so that the right resource is matched with each workload. Businesses should plan to optimise their environment by starting small and then scaling up. Starting a hybrid project with a small pilot allows the IT department to get comfortable with the ins and outs of the hybrid model before rolling it out further across the organisation.
For a successful implementation of a hybrid model, organisations should also remain aware of the implementation challenges involved.
Hybrid continues to grow because it offers organisations the best of both worlds. By starting small and being aware of the implementation challenges, IT leaders can successfully implement a hybrid strategy that pragmatically embraces the new whilst making the best use of what is already in place. By going hybrid, today's IT leaders can pick the best-fit strategy for the current demands of their business, within a flexible framework that will enable them to manage future change.