Mind The Skill Gap: How To Expand Talent In Cybersecurity

The modern cybersecurity industry has been the subject of much attention in recent years as threats to businesses large and small continue to mount. Even outside of the industry, business experts have correctly concluded that more measures are necessary to counteract aging infrastructure and a growing number of possible attack surfaces. The interest exists, but what doesn’t exist is enough personnel to fill the myriad jobs in cybersecurity. And this problem is only expected to get worse as time goes on, with the number of new professionals dwarfed by the demand for their talents.

The discrepancy in cybersecurity is often blamed on the failure of universities to provide the courses necessary to train the next generation of experts. While this is a noted problem, it is only one of many. The same cybersecurity professionals who scoff at formalized education in the field are often unwilling to invest the resources into training others within their organization, instead preferring to solicit existing talent from other businesses. Though there is significant work involved when it comes to building a reliable team, the reward of cultivating new talent is preferable to poaching the old and widening the skill gap further. It’s a sacrifice, but one that must be made if the industry is to be sustainable moving forward.

The other advantage of in-house training is the integration of cybersecurity practices throughout all of a business’s processes. Too often, organizations view these tools as afterthoughts to be stapled onto the end of every project. The reality is that new systems, products, and infrastructure should be created with cybersecurity in mind. Training and apprenticeship programs provide enough personnel to adequately address anything new coming through the pipeline. This carries the added benefit of giving new entrants to the industry hands-on experience with a variety of systems.

And, if a company doesn’t have any place to start when it comes to building an in-house cybersecurity team, a third-party organization can help. These firms can provide support for businesses that don’t have the resources to train new experts and can lay the groundwork for a good security culture. However, even dedicated cybersecurity organizations should be mindful of their practices when it comes to recruiting versus fostering talent.

It’s also important to consider the kind of training that aspiring cybersecurity experts receive. The best way to handle cybersecurity is to start with the broad strokes: the compliance laws that every professional absolutely must know. In many cybersecurity courses, emphasis is placed on the products that businesses can use to combat threats. This produces experts with deep knowledge of one specific solution rather than a command of brand-agnostic approaches or the compliance standards that underpin the whole industry. Consultants should be neutral when recommending solutions and find whatever suits an organization’s needs.

The future of cybersecurity will need to be collaborative. From academic organizations refining their programs to businesses offering opportunities for interested professionals to learn, a lot needs to change about how new talent is cultivated. In the future, expect to see a new generation of experts who know compliance law inside and out, and who are focused on spreading their best practices to others.

Why Cybersecurity is Important For Small Businesses

Most small businesses think themselves beneath notice in the larger corporate world. After all, with only modest IT systems and little infrastructure, there seems to be less to manage and fewer vulnerabilities to address. Anything related to technology is often an afterthought for small businesses, and many make the mistake of believing that cyber threats are not a potential issue.

The unfortunate reality is that cybercriminals are more than willing to prey on small- to mid-sized businesses, even with bigger targets available. Widespread adoption of technology has made just about every enterprise a potential target. Given the lack of focus on technological infrastructure and fewer resources to dedicate to systems monitoring or recovery after a breach, these organizations are especially vulnerable. And every company has data worth stealing, to sell or use as leverage. The loss of any sensitive information can cost a company in both reputation and capital.

For small businesses, a bit of investment in cybersecurity can keep records safe and ensure that larger losses don’t happen down the line. A dedicated IT team isn’t required to take sensible precautions when adopting new technology or infrastructure. The first step is acknowledging that, while small businesses are vulnerable, they are not without options for their own protection.

Many cybersecurity breaches are the result of internal error; I can’t emphasize this enough. As a result, controlling for human mistakes such as weak passwords, clicking on risky emails, and using mobile devices on unsecured networks can go a long way. Many do not realize how many points of vulnerability exist. Small businesses should ensure that the tools in place are easy to use for employees who may not be familiar with these matters.

Mobile devices are such a massive point of vulnerability that it’s worth dedicating time to examining all of the ways they can go wrong. Between the difficulty inherent in managing them, the risk of public Wi-Fi, and employees bringing devices from home, small businesses have to account for every possible attack surface. Consistent policies are necessary to ensure that personal and business devices stay secure no matter where they go.

For that matter, small businesses should consider regulating access to certain systems and technology. Though they may not have an IT department or dedicated standards for who can access what, these organizations should consider which systems each employee has a consistent need for. If an employee doesn’t require a system to get their work done, they should not have access.
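That need-to-know principle is simple enough to capture in a few lines of code. The sketch below is purely illustrative; the role and system names are hypothetical placeholders, not any particular product. The key design choice is that access is denied unless it has been explicitly granted:

```python
# Minimal least-privilege check: deny by default, allow only explicit grants.
# Role and system names are hypothetical placeholders.
ROLE_SYSTEMS = {
    "bookkeeper": {"accounting", "payroll"},
    "sales": {"crm"},
    "manager": {"accounting", "crm", "inventory"},
}

def can_access(role: str, system: str) -> bool:
    """Return True only if the role has an explicit grant for the system."""
    return system in ROLE_SYSTEMS.get(role, set())

print(can_access("sales", "crm"))      # an explicit grant
print(can_access("sales", "payroll"))  # denied by default
```

An unknown role, or a system no one thought to grant, simply falls through to a denial, which is exactly the posture a small business without an IT department wants.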

Planning for a cyberattack should also account for the worst-case scenario of a breach occurring and ease the recovery process. Making backups of everything digital is the best and easiest way that a small business can protect themselves in the event of a breach and allow for the least downtime when something goes wrong. The investment to create on- and off-site backups is minimal, but the safety it provides is huge.
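As a rough illustration of the on- and off-site idea, even a short script can automate dated backups, assuming one local destination and one mounted off-site (or cloud-synced) drive. The paths here are placeholders:

```python
import shutil
from datetime import date
from pathlib import Path

def back_up(source: str, destinations: list[str]) -> list[Path]:
    """Copy the source folder into each destination as a dated zip archive."""
    archives = []
    stamp = date.today().isoformat()
    for dest in destinations:
        target = Path(dest) / f"backup-{stamp}"
        # make_archive writes <target>.zip and returns the archive path
        archives.append(Path(shutil.make_archive(str(target), "zip", source)))
    return archives

# Placeholder paths: one on-site copy, one off-site (e.g. cloud-synced) copy.
# back_up("/srv/records", ["/mnt/onsite-backup", "/mnt/offsite-backup"])
```

Scheduled daily, something like this covers the minimal-investment case described above; real setups would add rotation and verification on top.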

This is only a small sample of the tactics that small businesses can consider when investing in cybersecurity. VPNs, software audits, and proven antivirus software can also provide an additional line of defense. However, any small business should recognize that precautions do not guarantee safety, and may want to consider investing in the services of a third-party cybersecurity firm to assess risk levels and provide scalable solutions. Technology will become even more involved in business, and safe adoption is important for businesses of all sizes.

How Artificial Intelligence Can Affect Cloud Computing

Cloud computing is a massive frontier in its own right. Online activity has been redefined by the cloud, and it will come as no surprise that both the public and private sectors have eagerly adopted cloud data storage. But, like the fabled Wild West of yore, this technology is still being developed, with unknown changes over every horizon.

One of the ways that cloud computing is projected to grow is through artificial intelligence, another technology that has garnered recent attention. Many companies have posited that a fusion between AI and cloud technology has the capacity to usher in a new age of cloud computing. Recently, Aaron Levie, CEO of Box Inc., spoke about the company’s recent successes and mentioned that artificial intelligence would play a key role in their upcoming developments.

“We think that AI is going to be substantially powerful for the future of work, and we want to make sure we’re embedding intelligent experiences into everything we do and everything we build at Box,” said Levie.

Machine learning perhaps has the greatest potential to shake up the cloud sector. It has revolutionized the way that AI systems communicate with humans, respond to commands, and solve problems, and it has manifested in a way that many already take for granted: voice-activated assistants such as Siri or Cortana. Cloud computing could work in conjunction with AI to give these systems access to more data and help them make better decisions. As an AI processes information from cloud servers, it can learn from the patterns it finds and even send new data back into the cloud to help other machines learn as well.
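That pull-learn-push loop can be sketched in miniature. In this purely illustrative example, a plain dictionary stands in for a shared cloud store and the "model" is nothing more than a running average; the point is only the shape of the feedback cycle:

```python
# Illustrative only: a dict stands in for a shared cloud data store, and the
# "model" is a trivial running average learned from whatever data it pulls down.
cloud_store = {"samples": [2.0, 4.0, 6.0], "model_estimate": None}

def learn_from_cloud(store: dict) -> float:
    """Pull data from the shared store, update the model, push the result back."""
    samples = store["samples"]
    estimate = sum(samples) / len(samples)  # the "learning" step
    store["model_estimate"] = estimate      # share what was learned with others
    return estimate

print(learn_from_cloud(cloud_store))  # other machines can now read the estimate
```

Once the result is written back, any other machine reading the store benefits from the first one's work, which is the essence of the synergy described above.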

Even beyond communication, other cognitive services such as image analysis can benefit from having a cloud’s worth of data to draw from. These improvements can advance security technologies with facial recognition software, and assist customers in a way that feels more in line with a human operator.

Cloud learning platforms, such as Google Cloud Machine Learning, provide the first steps toward the synergy of AI and cloud, but they lack the deep learning frameworks that may be necessary for future developments. The next-generation cloud computing model may be the AI First Cloud, as CIO has dubbed it. Here, deep learning applications can be better implemented and run on a worldwide AI framework. Additionally, operational management tools could be revamped to allow for better control over the AI programs involved in infrastructure.

Ironically, future work with AI will require a lot of human input. While AI can process massive amounts of data through the cloud, it still makes mistakes, requiring data technicians to flag errors so the systems can improve. For example, IBM Watson, a popular AI cloud service applied to fighting cybercrime, needs human input to learn how to tackle different types of attacks. As cybersecurity is always evolving, it takes the efforts of numerous researchers to prepare Watson to do its job effectively.

The bottom line is that AI and cloud services are highly synergistic: AI systems can process large amounts of cloud data, and the cloud becomes more useful when AI can complete tasks on request. An intelligent services revolution is on the horizon, and more experiments in cloud-based AI can help get there.

Forecast of Fog: How Fog Computing Could Make an Impact

When it comes to the Internet of Things, connectivity is the name of the game.

Connectivity is, in fact, the appeal of a related technology that could represent another evolution in the same vein as the cloud. In fact, it even has a similar weather-based name to go along with it.

In terms of purpose, this new concept, dubbed “fog computing,” is essentially identical to the cloud: it is still intended to let users store and retrieve files without the need for a costly server infrastructure. The difference lies in the way the data is handled. Fog computing creates a “fog” of data: scattered packets, none of which contains the whole of any file.

This approach makes it much more difficult for attackers to steal information. Any attempts to retrieve files illegitimately only allow for access to one part of the data, giving attackers a garbled, incomplete mess. This is because, unlike cloud computing, fog computing involves a network of smaller servers, called fog nodes, that distribute packets of data. Nowhere in the fog network is any file stored in its entirety.
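To make the idea concrete, here is a hypothetical sketch of that distribution scheme, not a real fog platform: a file is split into packets that are dealt round-robin across fog nodes, so no single node ever holds the complete file, while a client that knows the scheme can still reassemble it. Real systems would add replication, encryption, and routing on top.

```python
def scatter(data: bytes, num_nodes: int) -> list[list[bytes]]:
    """Split data into packets and deal them round-robin across fog nodes."""
    packet_size = 4  # tiny packets, purely for illustration
    packets = [data[i:i + packet_size] for i in range(0, len(data), packet_size)]
    nodes = [[] for _ in range(num_nodes)]
    for i, packet in enumerate(packets):
        nodes[i % num_nodes].append(packet)  # no node holds consecutive packets
    return nodes

def gather(nodes: list[list[bytes]]) -> bytes:
    """A legitimate client that knows the scheme reassembles the file."""
    total = sum(len(node) for node in nodes)
    packets = [nodes[i % len(nodes)][i // len(nodes)] for i in range(total)]
    return b"".join(packets)

nodes = scatter(b"confidential customer records", 3)
assert gather(nodes) == b"confidential customer records"
print(nodes[0])  # any single node sees only a garbled, incomplete slice
```

Compromising one node here yields every third packet at best, which is exactly the "garbled, incomplete mess" an attacker is left with.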

The other advantage of fog computing is that the decentralized fog nodes enable data to be closer to the businesses that need to access it. The disadvantage of the cloud is that data still has to travel from servers to wherever it’s needed, reducing performance. Fog computing bypasses some of these concerns and enables faster data retrieval. In fog computing, data centers are smaller, easier to place, and ideally as widespread as possible.

In fact, some fog nodes are so small that they can be placed anywhere that they can connect to the network. Power poles, vehicles, and even oil rigs can host fog nodes and widen the network. Ideally, fog-enabled devices are situated at the edge, right where data is needed immediately. With this larger, less centralized network, doors open to monitor new types of data, particularly as it relates to machine to machine communications. Fog computing allows data to be analyzed and acted upon in seconds.

It’s the ideal marriage of the cloud and the Internet of Things, two concepts that have gained a lot of ground among technology and data experts in the last several years. Cloud computing, though certainly a disruptive technology, has faced scrutiny for the security issues associated with sending raw data over the Internet, particularly if sensitive data is subject to regulation. Its fog-based counterpart offers an approximation of cloud capabilities with the accessibility of IoT devices.

However, the cloud is not going away anytime soon. Fog computing is a great complement to the cloud, but does not process massive amounts of historical data the same way that the cloud does. Plus, with fog computing being an emerging technology, its capabilities may not surpass the cloud in many cases. For now, it offers businesses the opportunity to improve their data infrastructure without replacing any cloud services they might already be using.

In the future, we will see fog computing delivering valuable data that can help make business and technology more efficient and interconnected. The OpenFog Consortium is a collaboration between a number of high profile organizations, including Intel, Cisco, and Microsoft, with the intent of promoting widespread fog computing usage. After all, like other IoT devices, fog computing becomes much more effective the more devices that are integrated into the network.

Improving Research with the Cloud

When it comes to fighting disease, it can often feel like an uphill battle that consumes funds and lives without end. While treatment options have improved greatly for diseases such as Parkinson’s, cancer, and heart disease, the research methods associated with them have remained largely stagnant over the past few decades.

Now, cloud technology may provide doctors and researchers with better, more consistent controls when it comes to collecting, measuring, and analyzing data. I’ve already discussed the ways that the cloud can assist with medical maintenance and improve patient outcomes through Big Data, but when applied to research, it offers consistency and an easy exchange of ideas.

Cloud computing is a prime example of disruptive technology, and nowhere is that more apparent than in the medical field. Integration with wearable devices offers access to a wealth of biometric readings alongside easily shared patient data, further blurring the line between treatment and research. Before this, data was often collected solely by doctors during patient visits, and inconsistent reporting measures severely hampered the reach and quality of the collected data. The Michael J. Fox Foundation for Parkinson’s Research has tried to leave these practices behind and embrace cloud computing in an attempt to better understand and eventually cure the disease. With Big Data, the Foundation is able to get a better picture of the progression of the disease through patient wearables. Smartwatches allow researchers to monitor a number of biometric readings, including tremor symptoms and sleep patterns.

The Foundation’s CEO, Todd Sherer, Ph.D., remarked on the lack of progress in measuring Parkinson’s, stating that its measurement methods are “largely the same” as they were in 1817, when the disease was first described. In addition to the wearable devices that the Foundation uses, patients can enter notes about their daily symptoms on a web portal as a more consistent substitute for reporting directly to a doctor. The information is then collected on Cloudera CDH, a secure data platform.

Parkinson’s isn’t the only disease being tracked through cloud services. The American Heart Association recently partnered with Amazon Web Services (AWS) to create a cloud system that empowers scientists and researchers from all over the world to share data with each other. Data that had remained unseen for years is now being distributed on the cloud, to the mutual benefit of all organizations involved. While this system requires participants to sacrifice some measure of self-interest in the name of progress against a major problem facing humanity, enough researchers have stepped up to make the AHA’s new project a major step forward.

However, the cloud does still offer solutions for researchers interested in preserving their intellectual property. Hybrid clouds, another platform that I’ve discussed before, enable individuals to share data on a public cloud while storing their own progress on a private cloud. Because of this, others are still able to benefit from their research while they pioneer their own findings.
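A hedged sketch of that hybrid arrangement: each record is tagged as shareable or not, and a simple router sends shareable data to the public store while unpublished findings stay on the private one. The dictionaries and record names below are hypothetical stand-ins for real cloud storage:

```python
# Dicts stand in for the public and private cloud stores (hypothetical).
public_cloud: dict[str, str] = {}
private_cloud: dict[str, str] = {}

def store_record(record_id: str, payload: str, shareable: bool) -> None:
    """Route shareable data to the public store, everything else to the private one."""
    target = public_cloud if shareable else private_cloud
    target[record_id] = payload

store_record("trial-001", "aggregate outcome data", shareable=True)
store_record("draft-analysis", "unpublished findings", shareable=False)
print(sorted(public_cloud), sorted(private_cloud))
```

The single `shareable` flag is the whole policy here; in practice that decision would come from data-governance rules rather than a boolean, but the routing pattern is the same.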

Not only is the cloud changing the way that data is collected and analyzed, it changes the type of data that can be measured. Compilation of genetic data is much more manageable than before thanks to better ways to sift through data, enabling researchers to better compare genomes and find subtle genetic trends that would otherwise take countless hours to detect.

Collecting a large amount of data to leverage through the cloud is part of the ongoing efforts of many research organizations; part of the ambitious Cancer Moonshot initiative is establishing a “data ecosystem” aimed at sharing and analyzing patient data on a national level.

It is exciting to see the cloud being used for medical goals in addition to business goals. From consumer trends to genetic trends, expect to see measurement metrics and data analysis improve substantially over the next few years as this disruptive technology rapidly becomes the norm in research.