How Artificial Intelligence Can Affect Cloud Computing

Cloud computing is a massive frontier in its own right. Online activity has been redefined by the cloud, and it will come as no surprise that both the public and private sectors have eagerly adopted cloud data storage. But, like the fabled Wild West of yore, this technology is still being developed, with unknown changes over every horizon.

One of the ways that cloud computing is projected to grow is through artificial intelligence, another technology that has garnered a great deal of recent attention. Many companies have posited that a fusion of AI and cloud technology has the capacity to usher in a new age of cloud computing. Aaron Levie, CEO of Box Inc., recently spoke about the company’s successes and mentioned that artificial intelligence would play a key role in its upcoming developments.

“We think that AI is going to be substantially powerful for the future of work, and we want to make sure we’re embedding intelligent experiences into everything we do and everything we build at Box,” said Levie.

Machine learning perhaps has the greatest potential to shake up the cloud sector. It has revolutionized the way that AI systems communicate with humans, respond to commands, and solve problems. This has manifested itself in a way that many already take for granted: voice-activated assistants such as Siri or Cortana. Cloud computing could work in conjunction with AI to give it access to more data and help it make better decisions. As an AI processes information from cloud servers, it can learn from the patterns it finds and even send new data back into the cloud to help other machines learn as well.
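That feedback loop can be sketched in a few lines. This is a toy illustration, not any vendor’s API: the “cloud” is simulated with an in-memory store, and each machine learns a simple statistic locally, pushes it up, and pulls back an aggregate informed by everyone’s data.

```python
# Toy sketch of the loop described above: learn locally, share the result,
# and pull a model informed by every node. All names are illustrative.

cloud_store = {"updates": []}      # stand-in for a shared cloud data service

def learn_locally(readings):
    """Derive a simple pattern (here, just the mean) from local data."""
    return sum(readings) / len(readings)

def push_update(update):
    """Send what this machine learned back into the cloud."""
    cloud_store["updates"].append(update)

def pull_shared_model():
    """Aggregate everyone's updates into one shared estimate."""
    updates = cloud_store["updates"]
    return sum(updates) / len(updates)

# Three machines each learn from their own data and share what they found.
for local_data in ([1.0, 2.0, 3.0], [2.0, 4.0], [3.0, 3.0, 3.0]):
    push_update(learn_locally(local_data))

shared = pull_shared_model()
print(round(shared, 2))  # every node now benefits from all three datasets
```

Real systems replace the mean with trained model parameters and the dict with an actual cloud service, but the shape of the exchange is the same.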

Even beyond communication, other cognitive services such as image analysis can benefit from having a cloud’s worth of data to draw from. These improvements can advance security technologies with facial recognition software, and assist customers in a way that feels more in line with a human operator.

Cloud learning platforms such as Google Cloud Machine Learning provide the first steps towards the synergy of AI and cloud, but they lack the deep learning frameworks that may be necessary for future developments. The next-generation cloud computing model may be the AI First Cloud, as CIO has dubbed it. Here, deep learning applications could be better implemented and run on a worldwide AI framework. Additionally, operational management tools could be revamped to allow for better control over the AI programs involved in infrastructure.

Ironically, future work with AI will require a lot of human input. While AI can process massive amounts of data through the cloud, it still makes mistakes, requiring data technicians to flag errors so that the systems can improve in the future. For example, IBM Watson, a popular AI cloud service used to fight cybercrime, needs human input to learn how to tackle different types of attacks. As cybersecurity is always evolving, it takes the efforts of numerous researchers to prepare Watson to do its job effectively.

The bottom line is that AI and cloud services are very synergistic—AI can process large amounts of cloud data, and the cloud becomes more useful when AI can complete tasks on request. An intelligent services revolution is on the horizon, and more experiments in cloud-based AI can help get us there.

The Gap Between Cloud Demand and Delivery

With IT infrastructure changing due to the rise of cloud services, many enterprises have tried their hardest to get on top of these trends. They’d certainly be well-advised to do so; cloud services are ultimately more cost-effective, scalable, and flexible than traditional server-based computing. However, this sudden influx of demand has left IT and cloud experts struggling to keep up.

Four out of five enterprises have experienced unplanned downtime in this transitional period, and it has become more of a problem within the last twelve months. Beyond this, other issues, including cyber attacks, natural disasters, and infrastructure problems, have put the hurt on enterprises interested in transforming their IT services.

According to a recent poll from Veeam, a disaster recovery firm, around two-thirds of UK-based IT professionals reported that downtime has hurt critical efforts, limiting digital growth over time. Even beyond this, unplanned downtime can damage revenue or brand image and cause incidents when dealing with customers or clients.

The bottom line? IT services are hard-pressed to meet growing business demands, particularly when it comes to data backup and recovery. The interested observer may think of these as growing pains, but until they’re solved, further losses can be expected. For digital growth to continue unabated, better security and backup services are necessary for distributed computing.

As a relatively new technology, the cloud also suffers from something of a dearth of experts, leaving many veterans of the industry scrambling to pick up the new skills necessary to work with it effectively. It makes sense; leveraging the cloud well involves a kind of thinking that most IT professionals have never needed before.

For the wannabe cloud expert, there are quite a few resources available for gaining the knowledge necessary to help address the current need for cloud professionals. Of course, a simple Google search will fetch information on any aspect of the subject, but online learning outlets have never been a better place to start.

For instance, the National Institute of Standards and Technology publishes a great series of guides on the minutiae of cloud computing. For a more structured approach, codeacademy.com offers lessons on the subject as well. However, a lot of the information found online can be a bit overwhelming, so feel free to visit cloudpatterns.org for a better look at industry trends. Maintained by industry professionals, the site compiles patterns in skill and experience among individuals getting hired in cloud positions, and is a great way to stay informed on what’s relevant.

Cloud computing is going to continue to grow as an industry, to the point where cloud services are expected to comprise 46% of IT spending by 2019. In that time, private IT services will become increasingly popular as hybrid models ease businesses into the transition.

But for now, what the industry needs is highly motivated individuals willing to keep up with current trends and ensure that this continued transitional period goes smoothly for all businesses involved.

Improving Research with the Cloud

When it comes to fighting disease, it can often feel like an uphill battle that consumes funds and lives without end. While treatment options have improved greatly for diseases such as Parkinson’s, cancer, and heart disease, the research methods associated with them have remained largely stagnant over the past few decades.

Now, cloud technology may provide doctors and researchers with better, more consistent controls when it comes to collecting, measuring, and analyzing data. I’ve already discussed the ways that the cloud can assist with medical maintenance and improve patient outcomes through Big Data, but when applied to research, it offers consistency and an easy exchange of ideas.

Cloud computing is a prime example of disruptive technology, and nowhere is that more apparent than in the medical field, where integration with wearable devices offers access to a wealth of biometric readings in conjunction with easily shared patient data, further blurring the line between treatment and research. Before this, data was often collected solely by doctors during patient visits, and inconsistent reporting measures severely hampered the reach and quality of the collected data.

The Michael J. Fox Foundation for Parkinson’s Research has tried to leave these practices behind and embrace cloud computing in an attempt to better understand and eventually cure the disease. With Big Data, the Foundation is able to get a better picture of the progression of the disease through patient wearables. Smartwatches allow researchers to monitor a number of biometric readings, including tremor symptoms and sleep patterns.

The Foundation’s CEO, Todd Sherer, Ph.D., remarked on the lack of progress in measuring Parkinson’s, stating that its measurement methods are “largely the same” as they were in 1817, when the disease was first described. In addition to the wearable devices that the Foundation uses, patients are able to enter notes about their daily symptoms on a web portal as a more consistent substitute for reporting directly to a doctor. The information is then collected on Cloudera CDH, a secure data platform.
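To make the wearable idea concrete, here is a hypothetical sketch of the kind of analysis such a pipeline might run: estimating the dominant shaking frequency from accelerometer samples by counting zero crossings. Parkinsonian rest tremor typically falls in roughly the 4–6 Hz band, but the sample rate, thresholds, and signal here are all invented for illustration, not clinical use or the Foundation’s actual methods.

```python
# Illustrative tremor-band check from raw accelerometer samples.
# The signal below is synthetic; no real patient data is involved.
import math

SAMPLE_RATE_HZ = 50  # samples per second from the smartwatch (assumed)

def estimate_frequency(samples):
    """Rough frequency estimate: zero crossings per second, halved."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a * b < 0)
    duration_s = len(samples) / SAMPLE_RATE_HZ
    return crossings / (2 * duration_s)

def looks_like_tremor(samples, band=(4.0, 6.0)):
    """Flag signals whose dominant frequency falls in the tremor band."""
    freq = estimate_frequency(samples)
    return band[0] <= freq <= band[1]

# Two seconds of a synthetic 5 Hz oscillation, a tremor-like signal.
signal = [math.sin(2 * math.pi * 5 * t / SAMPLE_RATE_HZ)
          for t in range(2 * SAMPLE_RATE_HZ)]
print(looks_like_tremor(signal))
```

A production system would use proper spectral analysis rather than zero crossings, but even this crude version shows why continuous wearable data beats an occasional office visit: the device can evaluate every two-second window, around the clock.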

Parkinson’s isn’t the only disease being tracked through cloud services. The American Heart Association recently partnered with Amazon Web Services (AWS) to create a cloud system that empowers scientists and researchers from all over the world to share data with each other. Data that had remained unseen for years is now being distributed on the cloud, to the mutual benefit of all organizations involved. While this system requires the people involved to sacrifice some measure of self-interest in the name of progress on a major problem facing humanity, enough researchers have stepped up to make the AHA’s new project a major step forward.

However, the cloud does still offer solutions for researchers interested in preserving their intellectual property. Hybrid clouds, another platform that I’ve discussed before, enable individuals to share data on a public cloud while storing their own progress on a private cloud. Because of this, others are still able to benefit from their research while they pioneer their own findings.

Not only is the cloud changing the way that data is collected and analyzed, it is also changing the type of data that can be measured. Compiling genetic data is much more manageable than before thanks to better ways of sifting through it, enabling researchers to compare genomes and find subtle genetic trends that would otherwise take countless hours to detect.
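At its core, that comparison work is a scan for positions where sequences disagree, repeated across billions of bases, which is exactly the kind of bulk job cloud compute makes tractable. The sketch below shows the idea on two short, pre-aligned toy sequences; real pipelines handle alignment, quality scores, and far more, and all names here are made up for the example.

```python
# Illustrative variant scan over two aligned DNA sequences.
# Sequences are toy data; real genomes run to billions of bases.

def find_variants(reference, sample):
    """Return (position, ref_base, sample_base) for each mismatch."""
    return [
        (i, r, s)
        for i, (r, s) in enumerate(zip(reference, sample))
        if r != s
    ]

reference = "GATTACAGATTACA"
sample    = "GATTACAGTTTACA"
print(find_variants(reference, sample))  # the single position that differs
```

Scaled up across thousands of genomes, the same per-position comparison surfaces the subtle shared variants the article mentions, and the cloud lets that scan be split across many machines at once.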

Collecting a large amount of data to leverage through the cloud is part of the ongoing efforts of many research organizations; part of the ambitious Cancer Moonshot initiative is establishing a “data ecosystem” aimed at sharing and analyzing patient data on a national level.

It is exciting to see the cloud being used for medical goals in addition to business goals. From consumer trends to genetic trends, expect to see measurement metrics and data analysis improve substantially over the next few years as this disruptive technology rapidly becomes the norm in research.

Cloud Computing in Rural India

Cloud computing has been hailed unanimously as a positive development for the healthcare industry. Thanks to the cloud, medical professionals can easily track, store, and analyze data, improving both research and patient care.

Perhaps one of the most notable applications of new cloud technology is the enhanced capability to provide care to patients in remote areas. This may sound counterintuitive, as most cloud systems are dependent on a robust Internet connection to operate at their fullest, but a health technology company has created a cloud platform capable of operating at low bandwidth, addressing some of the hurdles faced when implementing these systems.

While India offers free healthcare to its citizens, its system is far from perfect, as the 70% of its citizens who live in remote villages are often unable to receive attention as readily as their urban counterparts. To attempt to bring some balance to the healthcare market in the country, Sameer Sawarkar and Rajeev Kumar recently released ReMeDi (Remote Medical Diagnostics), a platform that allows individuals with little training to act on behalf of doctors in remote areas.

Kumar and Sawarkar’s company, Neurosynaptic Communications, has dispatched medical technicians to over 2,000 villages across India to provide on-site examinations. The technicians use ReMeDi to gather basic information about patients, entering it all into a cloud-based medical repository. Patient information is then reviewed by an offsite doctor, who can provide recommendations and diagnoses based on the provided data. Additionally, an audio/video connection allows for personal consultations between doctors and patients.
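The workflow just described can be sketched as a simple submit-and-review loop. To be clear, this is a hypothetical illustration of the pattern, not ReMeDi’s actual API: an in-memory list stands in for the cloud repository, and every name and field is invented.

```python
# Hypothetical technician/doctor loop over a shared patient repository.
# The list below stands in for the cloud-based medical repository.
from dataclasses import dataclass

@dataclass
class PatientRecord:
    patient_id: str
    readings: dict          # basic vitals gathered on site
    diagnosis: str = ""     # filled in later by the offsite doctor

repository = []             # stand-in for the cloud repository

def technician_submit(patient_id, readings):
    """A village technician uploads an on-site examination."""
    record = PatientRecord(patient_id, readings)
    repository.append(record)
    return record

def doctor_review(record, diagnosis):
    """An offsite doctor reviews the readings and responds."""
    record.diagnosis = diagnosis

# A technician in the village uploads an exam...
rec = technician_submit("patient-001", {"pulse": 88, "temp_c": 38.2})
# ...and a doctor, possibly hundreds of kilometers away, replies.
doctor_review(rec, "Likely viral fever; rest, fluids, follow up in 3 days")
print(rec.diagnosis)
```

The value of the design is that the expensive resource, the doctor’s attention, is decoupled from the patient’s location: the record travels instead of the person.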

And, as with any other cloud system, it gets better the more data is gathered. Studying health trends based on Big Data can help detect early signs of certain ailments and lead to administration of potentially lifesaving care. The integration that cloud computing provides is helpful not only for managing a large number of patients, but for fueling future research and innovation.

Other diagnostic platforms, such as Sana, Intelehealth, and CliniPAK, seek to solve the same kinds of issues as ReMeDi. Sawarkar has been ardent about the potential changes that ReMeDi can make to the Indian healthcare system, with the stated goal of lowering costs for companies and enabling more rapid delivery of health services, something that ultimately benefits patients. To that end, he has stated that the basic cost for a consultation under this new platform is under a dollar.

India’s healthcare system does little to service outlying areas, and as a result, many residents of remote villages seek out private healthcare, which can be costly for families and individuals with little income. 80% of Indians use private healthcare, and the costs of traveling to receive medical attention are financially crippling to many. This is the root of Sawarkar’s excitement about ReMeDi; what once required hours or even days of travel and weeks’ worth of wages can now be accomplished in the villages that need attention.

Sawarkar has strived to make ReMeDi as flexible as possible, and the platform is capable of running on a wide range of devices. With the market for cloud-based healthcare systems growing, Sawarkar intends to expand distribution of the platform within India as well as to other developing countries without decent access to healthcare.

Currently, ReMeDi is used in eight countries, including Bangladesh, Ghana, and Kenya. With African countries facing similar issues with disadvantaged citizens in remote areas, it stands to reason that their governments will be similarly interested in improving healthcare distribution.

Street Smarts—Pioneering the Smart City

The Internet of Things (IoT) and cloud technology are two of modern business’s biggest disruptors because of the extent to which they integrate company data architecture and logistics management. Though businesses have heavily adopted this technology, it is starting to gain ground on a greater scale.

Enter the smart city. Invisibly, amongst skyscrapers and heavy traffic, devices exchange data, creating a detailed and complicated picture of city life. Any aspect of a city’s infrastructure can be monitored, be it water usage, traffic patterns, information systems, or waste management.

If this sounds a bit Orwellian to you, don’t worry—we’re far from installing surveillance on every street corner. Still, individuals have raised concerns about privacy as it relates to smart cities, an expected development when considering the similar controversy surrounding Big Data.

At this juncture, however, it’s hard to define what a smart city is. Several “flagship cities” have already claimed the title as a result of integrating IoT technology into their infrastructure. Across the board, most of these cities have similar goals: to better manage factors such as traffic, energy consumption, and crime to create a better environment for their citizens.

Based on the widespread adoption of this technology, it’s hard to set a threshold for what qualifies as a smart city. Most modern cities have monitoring devices tied into their infrastructure in some way, but truly smart cities have cross-platform integration, where multiple facets of management are tied into a single massive network.
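That cross-platform integration can be sketched as a single network that separate subsystems publish into. This is a minimal illustration of the idea, not any city’s real platform; the subsystem names and readings are invented.

```python
# Minimal sketch of cross-platform integration: separate city subsystems
# publish readings into one shared network, which can then answer
# questions no single silo could. All names and values are illustrative.

class CityNetwork:
    def __init__(self):
        self.readings = {}                    # subsystem -> latest values

    def publish(self, subsystem, values):
        """A subsystem pushes its latest readings into the network."""
        self.readings.setdefault(subsystem, {}).update(values)

    def snapshot(self):
        """One integrated view across every connected subsystem."""
        return dict(self.readings)

network = CityNetwork()
network.publish("water",   {"usage_m3_per_h": 1200})
network.publish("traffic", {"avg_speed_kmh": 31})
network.publish("waste",   {"bins_full_pct": 64})

print(sorted(network.snapshot()))  # the silos now share one picture
```

The point of the single network is exactly what distinguishes a truly smart city in the paragraph above: once water, traffic, and waste report into one place, planners can correlate across them rather than managing each in isolation.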

Another important way to define smart cities is their potential to benefit citizens. While monitoring city infrastructure may sound like it only benefits local governments, it should be noted that citywide networks are ideally implemented to improve the lives of those living there.

For instance, Madrid’s smart city initiative, known as the Smarter Madrid Platform, is dedicated to studying social issues and then using technology to help solve these problems. Other than the applications already discussed, Madrid also seeks to assist startup companies throughout the city.

There are certainly ways that a smart city can go wrong. As previously mentioned, privacy concerns may hamper progress, and for good reason. Some smart cities have opted to make all data collected publicly available, and this level of transparency may have to become a standard as the idea develops further.

So where can smart cities go from here?

The ideal is to use technology to create a safer, more automated lifestyle. This can be accomplished in a number of ways, but the general consensus seems to be that linking household technology to large scale city technology is the most solid start. Many aspects of life, including personal agendas, power efficiency, and financial management can be improved with cloud networks.

Better communications lead to more efficient offices, which in turn beget employee happiness and encourage new ideas. Waste can be minimized. I could spend an entire post listing the applications of smart city technology, and it would sound like a compilation of science fiction ideas. The result, however, is perhaps not utopian cities of glittering spires, but cities much like our own, albeit with the higher standard of living that civilization has always strived for.