Demand for Cloud EHR is Increasing Rapidly

Considering the changing landscape of requirements in healthcare data management, cloud-based Electronic Health Record (EHR) systems have seen rapid growth in demand for various reasons. When Epic Systems announced the acquisition of a Mayo Clinic data center for $46 million, it reinforced the belief that demand for cloud EHR is increasing steadily in the healthcare domain.

In 2016, demand for cloud-based technology solutions that help medical practitioners deliver better care while reducing administrative burdens is expected to gain momentum.

What is cloud-based EHR?

An EHR is a collection of electronic health data about individual patients or populations. It includes medical history, demographics, genetic history, medications and allergies, laboratory test results, age/weight/blood group, X-ray images, vital signs, and more, all in digital format and capable of being shared across different stakeholders.

Cloud-based EHR allows software and clinical data to be stored, shared and updated in the cloud, whereas traditional EHR systems usually make information available only to users in the same physical location as the software and servers.

Put more simply, cloud EHR lets users access and work with data hosted at a shared online location rather than on a personal disk drive or local server. All software and information is stored on an online network (also known as “in the cloud”), and any authorized user with an internet connection can access it.

Why has demand for cloud EHR increased?

Given the existing demand for cloud EHR solutions, the EHR market is expected to reach about $30 billion by 2020 and to keep growing beyond that.

[Figure: EHR market forecast. Source: Grand View Research]

This demand is primarily driven by increased need for anytime-anywhere accessible software solutions that reduce errors and increase ease of use.

[Figure: geographic analysis of the EHR market]

Legacy on-premise solutions are unable to meet the changing requirements of today’s healthcare sector. They are built on outdated client-server architectures that are costly, inflexible and cannot analyze data in real time. These limitations pose a significant challenge to healthcare providers who work with complex and disconnected datasets.

As compared to traditional on-site hosted solutions, cloud computing offers benefits such as:

  • Cost Reduction

Cloud-based software requires fewer development and testing resources, which means lower costs for application support and maintenance.

  • Improved Efficiency

Cloud solutions can automate many business processes, such as system upgrades. Being able to see the bigger picture in real time lets you focus on your core strengths.

  • Accessibility

Users can access applications from anywhere and on any device, breaking down geographic barriers and improving the speed with which decisions can be made.

  • Flexibility

Cloud-based networks can easily scale to accommodate and respond to rapid increases in the number of users and spikes in demand.

  • Reliability

Cloud computing allows applications to run independently of hardware, in a virtual environment hosted in secure data centers.

Today’s technological capabilities have made it possible to make health records more attractive to end users. Cloud-based EHR solutions with visually appealing interfaces and innovative methods of interpreting, analyzing and presenting health records have been successful in improving the doctor-patient relationship.

Related posts from 8KMiles

Top Health IT Issues You Should Be Aware Of

How Cloud Computing Can Address Healthcare Industry Challenges

How pharmaceuticals are securely embracing the cloud

5 Reasons Why Pharmaceutical Company Needs to Migrate to the Cloud

6 Reasons why Genomics should move to Cloud


In the exciting, dynamic world of Genomics**, path-breaking discoveries are made every day. On a mission to empower the Pharmaceutical and Health Care industries with a deeper understanding of the genome*, gene activity and the likelihood of mutation, research in Genomics generates massive amounts of significant data.

Research in Genomics churns out solutions: a vast amount of useful information with which numerous diseases and disorders could be identified, treated and prevented with improved efficiency. Now think about advanced gene therapy and molecular medicine!

This enormous range of data and information needs a system that is not just capable of handling the colossal data load but can also preserve it with high security and managed accessibility options.

  1. Large-scale genome sequencing, comparative genomics and pharmacogenomics require storage and processing of enormous volumes of data to derive valuable insights that facilitate gene mapping, diagnosis and drug discovery.
  2. The exhaustive genome database, in perpetual expansion mode, simply exceeds the capacity of existing on-premise data storage facilities.
  3. In addition, the research-intensive domain requires managed services for user governance, access management and data encryption, which demand synchronized efficiencies and compatibility with multiple systems that comply with international best practices and standard protocols.
  4. Cloud architecture, empowered by scalability and elastic efficiencies, provides virtual storage space for the expansive genome database, with assisted migration, accessibility and security implementation in place.
  5. Large-scale data processing, storage, simulation and computation can be carried out in virtual laboratories on the cloud.
  6. Last but not least, cloud solutions for Genomics can be configured to fit specific research and standardized protocol requirements, offering huge advantages in flexibility, compliance with protocols and regulatory standards, cost savings and time efficiency.

8K Miles, a leading Silicon Valley based Cloud Services firm, specializes in high-performance cloud computing solutions for Bioinformatics, Proteomics, Medical Informatics and Clinical Trials for CROs, and has emerged as one of the top providers of cloud IT and ITIS solutions for the Pharma, Health Care and allied Life Sciences domains.

*A Genome is the collection of the entire set of genes present in an organism.
**Genomics is the branch of science that deals with the structure, function, evolution and mapping of genomes.

April 20, 2016

Enhancing patient care with well-defined Identity Access Governance services


Richard Branson put it well in a recent tweet –
“…the only mission worth pursuing in business is to make people’s lives better.”

This is all the more true for Health Care IT. There is a strong moral responsibility in providing and protecting health care data, particularly the data enclosed in Electronic Health/Medical Records (EHRs). There are multiple opinions on who should be granted primary ownership of and access to these records, as the data is highly personal and hence sensitive and confidential.

Interoperability in Health Care IT is the ability of different IT systems and applications to communicate, exchange and use data. This is a boon for those who have to cope with a change of residence, doctor, hospital or health care provider, since there is a good chance that no two places operate the same IT infrastructure. With it comes the need for these parties to interact in a secure manner that can be monitored and managed efficiently. This is achieved by a comprehensive set of Identity & Access Governance (IAG) services.

Ideally, IAG should be designed in such a way that it effectively answers the following five questions:

  1. Is your system built on a hack-resistant environment?
  2. Do you protect patients’ records? If yes, how?
  3. Who are the intended users of an EHR? Is the User given permission to access and own the resource? If yes, how? If no, why?
  4. Do you restrict users from accessing a particular portal, for security reasons, on justified grounds?
  5. How would you monitor employees within a health care facility, to check if s/he still has access to resources tied to her/his past role in the organization?

The HIPAA Security Rule requires that a user or entity accessing protected health information (PHI) be authenticated before such access is granted. IAG services, therefore, should implement security measures sufficient to reduce risks and vulnerabilities to a reasonable and appropriate level.

8K Miles, a leading Silicon Valley based Cloud Services firm, addresses this golden Security Rule through elaborate risk analysis and assessment, helping healthcare service providers implement reliable, real-time IAG services and solutions, whether on the cloud or in on-premise data centres. It has emerged as one of the most trusted providers of IT and ITIS solutions for the Health Care, Pharma and allied Life Sciences domains.

April 20, 2016

Summary of Chennai Azure Global Bootcamp 2016

The Chennai Global Azure Bootcamp 2016, held on April 16th, 2016, went really well, with lots of technical sessions and hands-on labs. We received more than 1,000 registrations from people with a variety of backgrounds: practicing professionals, a few from an IT Pro background, and lots of students aspiring to become cloud professionals soon.
The event started at 10.00 am with a keynote by Balaji Uppili, Chief Delivery Officer, GAVS. He gave a lightning talk on the current cloud landscape and how Azure is leading the game, and touched briefly on how developers must equip themselves to stay relevant in the ever-changing IT industry. Soon after the keynote, presenters started offering sessions in two tracks: 1. Developer, 2. IT Pro.

We had 10 fantastic speakers from Chennai and Bangalore who delivered sessions on various topics, including Azure App Service, open source and big data. I delivered a session on Azure Data Lake Analytics and Data Lake Store, services which are currently in preview; the attendees were able to recognize their value and how they can help developers leverage these big data analytics services and exploit big data.

Event Highlights

 

  • Total registrations for the event: 1,000+
  • Attendees who joined the sessions: 450
  • Dev/IT Pro attendees: 75%
  • Student partners: 25%
  • Total number of technical tracks: 10
  • Hands-on labs conducted: 1

Tracks

 

09:00 – 09:30: Reception and Registration
09:30 – 10:00: Chennai Global Azure Bootcamp: Keynote by Balaji

Dev Track / IT Pro Track
10:15 – 11:00: Building Mobile Apps with Visual Studio and Azure Services / Power of Open Source on Azure (Ruby on Rails)
11:00 – 11:15: Café break
11:30 – 12:15: Deep into Azure Machine Learning & Predictive Analytics / Running Linux workloads on Azure
12:30 – 13:15: DevOps in Azure / Kick off your project for success!
13:15 – 14:00: Lunch
14:00 – 14:45: Deep dive into Azure SQL / Introduction to Data Lake Analytics & Store
15:00 – 15:45: IoT on Azure (Part 1) / Azure AD with Office 365
15:45 – 16:00: Café break
16:15 – 17:00: IoT on Azure (Part 2) / Azure Hands-on Labs
17:00 – 17:30: Azure Global Bootcamp 2016 Closing

 

Local Sponsors

GAVS Technologies and 8KMiles Software Services were the key local sponsors who helped us execute such a large event in Chennai. In fact, this is one of the largest community-driven Azure events conducted in the city in the recent past. I’m very thankful to all the sponsors for helping us execute the event.
[Image: Azure Bootcamp sponsors]

Conclusion

Overall, the event went really well, and a lot of great content was delivered by our awesome experts and MVP speakers. Thanks to all the presenters and attendees! Without you, there wouldn’t have been an event. Special thanks also go to the Global Azure Bootcamp team for organizing the global event and arranging the prizes from all the global sponsors.

I had a great time presenting and helping people with the hands-on labs, launching Windows and Linux VMs towards the end of the day. It was a great learning and fun experience. I’m planning to help coordinate the Chennai Global Azure Bootcamp event next year as well, God willing.

Until next year, adios amigos!

P.S. Please feel free to contact me with any questions about Azure or general feedback on the event. You can submit them in the comments on this post, via Twitter @ilyas_tweets, or by email at ilyas.f@8kmiles.com.

 

 

April 20, 2016

5 Considerations you need to know before investing in Big Data Analytics

A vast number of companies across industries collaborate with data analytics companies to increase operational competence and make better business decisions. Handled properly, big data can lead to immense change in a business. Though data analysis is a powerful tool, most companies are not ready to adopt data analysis software as a practical resource. Purchasing data analytics isn’t as simple as buying software; there are many things to consider before a company invests in analytics.

You should know exactly where your company stands in terms of its analysis systems, and consider the following before investing in big data analytics.

What do you want to find out from your data?

You should know what you will be using your analytics software for before investing in it. If you don’t know what business problem you need to solve, then collecting data and setting up an analysis system isn’t productive. Check for areas of your company where the current process is not effective, and work out the different questions you need answered before investing in a solution, so you can choose an appropriate analytics partner.

Do you have enough data to work with?

You need significant and reliable data to perform analytics, so check whether your company has enough data or workable information to analyze. You should also determine whether the company can afford to collect such information and has the ability to do so. This can become expensive once you factor in labor costs, hours spent categorizing the information, and data storage, so consider data aggregation and storage costs before moving forward.

Do you have the capital to invest in analytics software?

The price range for analytics software varies with a company’s needs. A few software vendors offer data warehousing, which can be ideal for companies that require both data storage and analytics and have a large budget. Other vendors offer visualization systems, in both SaaS and on-premise forms. As visualization comes in varied price ranges, your company should be able to find a solution that fits its budget.

Besides the software cost, you should estimate the cost of effort and services, which can be as much as five times the software price. The investment varies with the size and depth of the project, but it’s necessary to completely understand the costs involved in data analytics before investing.

Do you have the resources to work with your data?

Many analytics systems are automated, but they still need user interaction and management. You need a data engineer for ongoing data ingestion, organization and provisioning of data marts for the data analysts and data scientists, who in turn deliver new insights and predictions by updating the data processing rules and algorithms/models as business needs change. Having a designated owner for analytical decisions also avoids confusion; that person should be able to allot time and materials for scrutinizing data and producing reports.

Are you ready to take action?

At the final stage, you will have collected data, identified the problem, invested in the software and performed the analysis; but to make it all worthwhile you have to be ready to act immediately and efficiently. With the newly discovered insights, you have the information required to change your organization’s practices. Executing a new project can be expensive, so it’s essential to have the resources necessary for implementing the change ready.

Data analytics can be a powerful tool to improve a company’s efficiency. So remember to consider these five factors before investing in big data analytics.

Powershell: Automating AWS Security Groups

To provision and manage EC2 instances in the AWS cloud in a way that complies with industry standards and regulations, the individuals administering them should understand the security mechanisms within the AWS framework—both those that are automatic and those that require configuration. Let’s take a look at Security Groups, which fall under the latter category.

As there is no “absolute Security Group” that can be plugged in to satisfy every need, we should always be open to modifying them. Automating this via PowerShell provides predictable, consistent results.

What Is a Security Group?

Every VM created through the AWS Management Console (or via scripts) can be associated with one or more Security Groups (up to 5 in a VPC). By default, all inbound and outbound traffic at the instance level is blocked. We should automate the infrastructure to open only the ports the customer needs, which means adding ingress/egress rules to each Security Group per customer requirements. For more details, have a look at the AWS Security Group documentation.

It is important to allow traffic only from valid source IP addresses; this substantially prunes the attack surface, whereas using 0.0.0.0/0 as the IP range leaves the infrastructure vulnerable to sniffing or tampering. Traffic between VMs should always traverse Security Groups; we can achieve this by allowing the initiator’s Security Group ID as the source.


Automation Script


I have kept this as a single block; if one wishes, they can create a function out of it. A few things worth considering:

  • Execution of this script will only work given a valid pair of Secret Key & Access Key
  • The script uses filtering functionality: the end user provides a name pattern, and selection of the Security Group is driven by that pattern
  • To facilitate the whole operation you have to provide certain parameters, i.e. [IpProtocol, FromPort, ToPort, Source]
  • The Source parameter can be interpreted in two ways: you can either provide IpRanges in CIDR block format or choose another Security Group as the source in the form of a UserIdGroupPair

<#
.SYNOPSIS
Simple script to safely assign/revoke ingress rules on a VPC Security Group.

.DESCRIPTION
The script first checks which rules have been specified for update; if a rule is already assigned, running it again will do no harm.
If the assignment is successful, it can be verified in the AWS console.

NOTE: The script must be updated with a proper group-name pattern and security credentials.
#>

# Update the following lines, as needed:

Param(
    [string]$AccessKeyID = "**********",
    [string]$SecretAccessKeyID = "********",
    [string]$Region = "us-east-1",
    [string]$GrpNamePattern = "*vpc-sg-pup_winC*",
    [string]$GroupId = "sg-xxxxxxxx",
    [string]$CidrIp = "0.0.0.0/0",
    [switch]$SetAws = $true,
    [switch]$Revoke,
    [switch]$Rdp = $true,
    [switch]$MsSql = $true
)

$InfoObject = New-Object PSObject -Property @{
    AccessKey      = $AccessKeyID
    SecretKey      = $SecretAccessKeyID
    Region         = $Region
    GrpNamePattern = $GrpNamePattern
    GroupId        = $GroupId
    CidrIp         = $CidrIp
}

if ($SetAws)
{
    Set-AWSCredentials -AccessKey $InfoObject.AccessKey -SecretKey $InfoObject.SecretKey
    Set-DefaultAWSRegion -Region $InfoObject.Region
}

# Source security group for the RDP rule: traffic is allowed from members of this group
$PublicGroup = New-Object Amazon.EC2.Model.UserIdGroupPair
$PublicGroup.GroupId = $InfoObject.GroupId

# Select the target security group by name pattern (assumes the pattern matches a single group)
$filter_platform = New-Object Amazon.EC2.Model.Filter -Property @{ Name = "group-name"; Values = $InfoObject.GrpNamePattern }
$SG_Details = Get-EC2SecurityGroup -Filter $filter_platform | Select-Object GroupId, GroupName

# RDP (3389) allowed from the source security group; MS SQL (1433) allowed from the CIDR range
$rdpPermission = New-Object Amazon.EC2.Model.IpPermission -Property @{ IpProtocol = "tcp"; FromPort = 3389; ToPort = 3389; UserIdGroupPair = $PublicGroup }
$mssqlPermission = New-Object Amazon.EC2.Model.IpPermission -Property @{ IpProtocol = "tcp"; FromPort = 1433; ToPort = 1433; IpRanges = $InfoObject.CidrIp }

$permissionSet = New-Object System.Collections.ArrayList

if ($Rdp)   { [void]$permissionSet.Add($rdpPermission) }
if ($MsSql) { [void]$permissionSet.Add($mssqlPermission) }

if ($permissionSet.Count -gt 0)
{
    try {
        if (!$Revoke) {
            "Granting to $($SG_Details.GroupName)"
            Grant-EC2SecurityGroupIngress -GroupId $SG_Details.GroupId -IpPermissions $permissionSet
        }
        else {
            "Revoking from $($SG_Details.GroupName)"
            Revoke-EC2SecurityGroupIngress -GroupId $SG_Details.GroupId -IpPermissions $permissionSet
        }
    }
    catch {
        if ($Revoke) {
            Write-Warning "Could not revoke permission from $($SG_Details.GroupName)"
        }
        else {
            Write-Warning "Could not grant permission to $($SG_Details.GroupName)"
        }
    }
}

 

 

What we are looking at here is the ability to automate the creation and updating of Security Groups. Use this script if you run into frequently changing Security Groups.
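As a quick sketch of how this might be run (the script file name, access keys, group ID and CIDR below are hypothetical placeholders, not values from this post):

# Hypothetical invocation, assuming the script above is saved as Update-SGIngress.ps1.
# Grants RDP (from the given source security group) and MS SQL (from the given CIDR)
# to the security group matching the name pattern:
.\Update-SGIngress.ps1 -AccessKeyID "AKIA..." -SecretAccessKeyID "..." `
    -Region "us-east-1" -GrpNamePattern "*vpc-sg-pup_winC*" `
    -GroupId "sg-0123abcd" -CidrIp "203.0.113.0/24"

# Adding the -Revoke switch removes the same rules instead of granting them:
.\Update-SGIngress.ps1 -GrpNamePattern "*vpc-sg-pup_winC*" -GroupId "sg-0123abcd" `
    -CidrIp "203.0.113.0/24" -Revoke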


P.S. This script has been written with VPC in mind; differences in parameter usage between VPC and EC2-Classic security groups should be taken care of.

 

Credits – Utkarsh Pandey

April 19, 2016

DevOps with Windows – Chocolatey

Conceptually, package management is a well-understood space for anyone with even the slightest understanding of how *nix environments are managed, but on Windows it was uncharted territory until recently. This is a piece of the stack that was ironically missing for so long that once you get your hands on it, you will wonder how on earth you lived without it. NuGet and Chocolatey are the two buzzwords making lots of noise, and they are seen as the future of Windows server management.

What Is Chocolatey?

Chocolatey builds on top of the NuGet packaging format to provide package management for Microsoft Windows applications; it is a kind of yum or apt-get, but for Windows. It is CLI-based and can be used to decentralize packaging. It has a central repository located at http://chocolatey.org/.

If you have ever used the Windows built-in provider, you are probably aware of its issues. It doesn’t really do versioning and is a poor fit for upgrades. For any organization looking for a long-term way to ensure the latest versions are always installed, the built-in package provider may not be the recommended option. Chocolatey takes care of all this with very little effort. In contrast to the default provider, which has no dependencies, Chocolatey requires your machine to have PowerShell 2.0 and .NET Framework 4.0 installed. Installing a package from Chocolatey is one command line that reaches out to the internet and pulls it down. Packages are versionable and upgradable; you can specify a particular version of a package, and that’s what gets installed.

The recommended way to install Chocolatey is by executing a PowerShell script.
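For reference, the install one-liner (the same command the CloudFormation sample below executes) can be run directly from an elevated PowerShell prompt; the choco commands that follow are a minimal sketch of the versioning behavior described above — Firefox is the example package used later in this post, and the pinned version number is purely illustrative:

# Install Chocolatey
Invoke-Expression ((New-Object Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))

# Install a package from the central repository
choco install firefox

# Pin a specific release (version number is illustrative)
choco install firefox --version 45.0.1

# Upgrade all Chocolatey-managed packages
choco upgrade all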

Chocolatey With AWS

AWS offers Windows instances in both of its offerings: under IaaS you can launch a Windows instance on EC2, whereas with PaaS you can get one via Elastic Beanstalk.

Using CloudFormation:

Using ‘cfn-init’, AWS CloudFormation supports downloading files and executing commands on a Windows EC2 instance. Bootstrapping a Windows instance with CloudFormation is a lot simpler than the alternatives. We can leverage this to install Chocolatey while launching the server from a CloudFormation template: we execute PowerShell.exe and pass it the install command. One thing to take care of is that the Chocolatey installer, and the packages it installs, may modify the machine’s PATH environment variable. This adds complexity, since subsequent commands run in the same session, which does not have the updated PATH. To overcome this, we use a command file that sets the session’s PATH to the machine’s PATH before executing our command. We will create a command file ‘ewmp.cmd’ to execute a command with the machine’s PATH, and then proceed with Chocolatey and any other installation. The sample below installs Chocolatey and then installs Firefox with Chocolatey as the provider.

"AWS::CloudFormation::Init": {
  "config": {
    "files": {
      "c:/tools/ewmp.cmd": {
        "content": "@ECHO OFF\nFOR /F \"tokens=3,*\" %%a IN ('REG QUERY \"HKLM\\System\\CurrentControlSet\\Control\\Session Manager\\Environment\" /v PATH') DO PATH %%a%%b\n%*"
      }
    },
    "commands": {
      "1-install-chocolatey": {
        "command": "powershell -NoProfile -ExecutionPolicy unrestricted -Command \"Invoke-Expression ((New-Object Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))\""
      },
      "2-install-firefox": {
        "command": "c:\\tools\\ewmp choco install firefox"
      }
    }
  }
}

 

Using AWS Elastic Beanstalk:

AWS Elastic Beanstalk supports downloading files and executing commands on instance creation using container customization. We can leverage this feature to install Chocolatey.

The installation above translates into AWS Elastic Beanstalk config files, enabling the use of Chocolatey in Elastic Beanstalk. The difference with Elastic Beanstalk is that we create YAML .config files inside the .ebextensions folder of our source bundle.

files:
  c:/tools/ewmp.cmd:
    content: |
      @ECHO OFF
      FOR /F "tokens=3,*" %%a IN ('REG QUERY "HKLM\System\CurrentControlSet\Control\Session Manager\Environment" /v PATH') DO PATH %%a%%b
      %*
commands:
  1-install-chocolatey:
    command: powershell -NoProfile -ExecutionPolicy unrestricted -Command "Invoke-Expression ((New-Object Net.WebClient).DownloadString('https://chocolatey.org/install.ps1'))"
  2-install-firefox:
    command: c:\tools\ewmp choco install firefox

 

The above works the same way as the CloudFormation sample: it creates a command file ‘ewmp.cmd’ to execute a command with the machine’s PATH before installing Chocolatey and Firefox.

 

P.S. Chocolatey is best used as a package provider for Puppet on Windows. Puppet offers great support for promoting and executing Chocolatey on Windows.

 

 

Credits – Utkarsh Pandey

 

April 19, 2016

Top Health IT Issues You Should Be Aware Of

Information Technology (IT) plays a major role in refining healthcare facilities to improve patient care and organize vast quantities of health-related data. Over several years, healthcare across the country has seen remarkable growth with the help of IT, and both the public and private healthcare sectors are using IT to meet their new requirements and standards. Though IT plays an important role in improving the quality of patient care, increasing efficiency and reducing cost, there are certain health IT issues you should be aware of and should fix:

Database

New databases and related tools are needed to manage huge amounts of data and improve patient care. Relational databases (like those underlying electronic health records (EHRs)) organize data into tables and rows, forcing information into predefined groups; they are perfect for easily structured information but cannot handle unstructured data (like notes, clinical narratives, etc.). With non-relational databases, it is easy to analyze diverse data forms and avoid that rigid structure, so using a non-relational database will help manage and make proper use of the vast amount of healthcare data.

Mobile Healthcare and Data Security

With changes in financial incentives and the growth of mobile healthcare technology, patient care is shifting toward the consumer. Mobility makes it easy to provide care from anywhere, at any time, and additional tools for wellness and disease-management programs help reduce the money spent on health plans. However, cyber security issues are the biggest threat, as a data breach can cause huge financial loss. It is necessary to take action to prevent breaches, as this is a major issue.

With the increase in healthcare mobility, it is essential to introduce a mobile/BYOD policy that helps avoid data breaches and privacy intrusion.

Health Information Exchange (HIE)

HIE enables the sharing of healthcare data between healthcare organizations. Concerns related to healthcare policy and standards should be analyzed before implementing such exchanges, as sensitive data is at risk.

Wireless Network and Telemedicine

Wireless networking is mandatory for healthcare employees to deliver medical services. Retrofitting older health IT services for wireless access can be expensive and challenging due to structural limitations, and wireless issues continue to be an obstacle to telemedicine adoption. Varying state policies on telemedicine use and reimbursement also continue to restrict this emerging technique.

Data analysis

Data analysis plays a major role in assisting, treating and preventing illness and providing quality care to people. Implementing a data analysis system that offers secure data storage and easy access can be an expensive and demanding task.

Cloud System

Cloud systems raise many questions with respect to data ownership, security and encryption. To address these issues, some providers are experimenting with cloud-based EHR systems while others build their own private clouds.

The necessity of health IT grows every day. Though health IT has become a major phenomenon, we should remember that new challenges will continue to appear as it progresses. So stay aware, keep yourself updated on the top health IT issues, and tackle them.

Related posts from 8KMiles

How Cloud Computing Can Address Healthcare Industry Challenges

How pharmaceuticals are securely embracing the cloud

5 Reasons Why Pharmaceutical Company Needs to Migrate to the Cloud

8K Miles Tweet Chat 2: Azure

If you missed our latest Twitter chat on Azure, or wish to go through it once again, this is the right place! Here’s a recap of the 12th April tweet chat: a compilation of all the questions asked and the answers given by the participants. The official tweet chat handle of 8K Miles, @8KMilesChat, shared frequently asked questions (FAQs) related to Azure, and here’s how they were answered.

[Screenshots of the tweet chat questions and answers (Q1–Q10)]

We received clear answers to every question asked, and it was an informative chat on Azure. For more such tweet chats on the cloud industry, follow our Twitter handle @8KMiles.

The active participants during the tweet chat were cloud experts Utkarsh Pandey and Harish CP. Here’s a brief note on their expertise:

Utkarsh Pandey

Utkarsh is a Solutions Architect who, as an AWS and Azure certified solution architect, is responsible for cloud development services.

HarishCP

HarishCP is a Cloud Engineer. He works in the Cloud Infrastructure team, helping customers with infrastructure management and migration.


Puppet – An Introduction

The most common issue while building and maintaining large infrastructure has always been wasted time: the amount of redundant work performed by each member of a team is significant. The idea of automatically configuring and deploying infrastructure evolved out of a wider need to address this problem.

Puppet and Chef are two among the many configuration management packages available. They offer a framework for describing your application/server configuration in a text-based format. Instead of manually installing IIS on each of your web servers, you can write a configuration file which says “all web servers must have IIS installed”.

What Is Puppet?

Puppet is Ruby-based configuration management software that can run in either client-server or stand-alone mode. It can be used to manage configuration on UNIX (including OS X), Linux, and Microsoft Windows platforms. Unlike provisioning tools that build your hosts and then leave them on their own, it is designed to interact with your hosts in a continuous fashion.

You define a “desired state” for every node (agent) on the Puppet master. If an agent node doesn’t resemble its desired state, in Puppet terms “drift” has occurred. The actual decision on how your machine is supposed to look is made by the master, whereas agents only provide data about themselves and are then responsible for actually applying those decisions. By default, each agent contacts the master every 30 minutes, which can be customized. The entire process can be summed up in the following workflow.

[Figure: Puppet master–agent data flow]

  1. Each node sends its current information (current state) to the master in the form of facts.
  2. The Puppet master uses these facts to compile a catalog describing the desired state of that agent, and sends it back to the agent.
  3. The agent enforces the configuration specified in the catalog and sends a report back to the master indicating success or failure.
  4. The Puppet master generates a detailed report, which can be fed to any third-party monitoring tool.
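As a small illustration of steps 1–3, a run can also be triggered on demand from PowerShell on a Windows node rather than waiting for the next scheduled check-in (this sketch assumes the Puppet agent is installed and on the PATH):

# Trigger a single on-demand Puppet run: the agent sends its facts,
# receives the compiled catalog, applies it, and reports back to the master.
puppet agent --test

# The default 30-minute check-in interval can be customized in puppet.conf, e.g.:
# [agent]
# runinterval = 15m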

Credits – Utkarsh Pandey

April 13, 2016

Meet 8K Miles Cloud Experts at Bio-IT World Conference & Expo ‘16

The annual Bio-IT World Conference & Expo is around the corner! Cambridge Healthtech’s 2016 Bio-IT World Conference and Expo is happening at the Seaport World Trade Center, Boston, MA, and 8K Miles will be attending and presenting at the event. The three-day meet, from April 5th to 7th, includes 13 parallel conference tracks and 16 pre-conference workshops.

The Bio-IT World Conference & Expo continues to be a vibrant event every year, uniting 3,000+ life sciences, pharmaceutical, clinical, healthcare, and IT professionals from more than 30 countries. At the conference, look forward to compelling talks, including best-practice case studies and joint partner presentations, featuring over 260 of your fellow industry and academic colleagues discussing themes of big data, smart data, cloud computing, trends in IT infrastructure, genomics technologies, high-performance computing, data analytics, open source and precision medicine, from the research realm to the clinical arena.

When it comes to the cloud, Healthcare, Pharmaceutical and Life Sciences organizations have special needs. 8K Miles makes it stress-free for your organization to embrace the cloud and reap all the benefits it offers while meeting your security and compliance needs. Stop by booth #128 at the event to meet 8K Miles Director of Sales Tom Crowley, a versatile, goal-oriented sales, business development and marketing professional with 20+ years of wide-ranging experience and accomplishments in the information security industry.

Also at the event, on Wednesday, April 6th from 10:20–10:40 am, two of our 8K Miles speakers, Sudish Mogli, Vice President, Engineering, and Saravana Sundar Selvatharasu, AVP, Life Sciences, will be presenting on Architecting your GxP Cloud for Transformation and Innovation. They will share solutions and case studies for designing and operating on the cloud in a GxP environment, using a set of frameworks that encompasses operations, automation, security and analytics.

We are just a tweet away! To schedule a one-on-one meeting with us, tweet to @8KMiles. We look forward to meeting you at the event!