Like many great debates that run for years, the comparison of SAS with other programming languages, and of their respective pros and cons, is a continuous process. The language of choice, SAS, Python or R, has long been a point of discussion among analytics professionals.
Technological advancement is so rapid that this debate can be revisited every couple of years and the answer can swing either way. In this blog, we will discuss global trends, the SAS ecosystem, what SAS offers on its own, and the advantages of knowing SAS as a language.
Commercial analytics has always seen a strong presence of SAS, as it offers a huge collection of statistical functions. SAS has a good support system that aids quick adoption of the language, and it provides excellent technical support. The areas that work against SAS are cost, as it is the most expensive option, and the fact that it lags behind languages like R and Python in adding the latest statistical functions.
To make an informed decision about whether SAS is the right programming language for you, let's look at its key attributes.
In terms of convenience and price, let's accept that SAS is commercial software; it is expensive and not very affordable for most professionals. Unless you are associated with an established institution that has invested in SAS, it might be difficult for you to get your hands on it.
SAS is comparatively easy to learn, especially for analysts with SQL knowledge. As mentioned above, it has a good support system, with tutorials and comprehensive documentation, but these are costly when compared to other programming languages, which are themselves known for a fair degree of simplicity. The SAS GUI, however, is very stable.
The data handling capabilities of SAS used to be its USP until a while ago. However, in recent comparisons, especially with R and Python, it can safely be said that this is no longer the case.
The graphical capabilities of SAS are functional at best; any customisation requires a good understanding of the SAS Graph package, and even then customising output in SAS is difficult. This is a slight disadvantage when compared with other languages.
Development of SAS tools is more or less on par with other languages. Open languages roll out new versions based on open contributions, so errors are possible; SAS also releases updates, and they are well tested.
SAS is still considered the market leader in the job market at most established corporations. R and Python, along with other programming languages, become the preferred options for newer companies looking for cost efficiency.
To conclude, yes, the market is opening up to other programming languages as well, so the choice ultimately depends on your circumstances. If you are a fresher, it is recommended that you learn SAS as a first language, purely because it holds a high share of the job market and is fairly easy to learn. If you are a veteran in the analytics world, then diversifying and adopting a new programming language is recommended. After all, knowing more than one language only adds to your flexibility and opens up that many more opportunities.
Category: Analytics
Data Science – Things You Should Know
Data, or Big Data, is the new buzzword. No matter where you are from or what line of work you are in, you cannot avoid coming across Data Science. You might not understand data science and all that comes with it, but you cannot deny that its impact is enormous, and that it mostly changes things around it for the better.
Where is this data coming from and why now?
Nowadays over six billion devices are connected to the internet. All the connected applications and devices have users, and their activity on the internet generates data; it is estimated that more than 2.5 million terabytes of data are created every 24 hours. That is humongous! And as technology keeps getting woven into our daily lives, these volumes are going to increase exponentially.
This is a fascinating piece of information, especially if you are from the field of IT, more so because there have recently been massive layoff drives within Indian tech companies. As the law of nature goes, it is time for IT professionals to evolve or perish, and what better time than now to dive into the world of data science.
So if data science interests you, or you want a quick understanding of the subject, read on.
Data science has been called the 'Sexiest Job of the 21st Century' by the Harvard Business Review. Since such huge volumes of data are created daily, you need skills from the data science field to extract insights from this information and put them to use.
Data science is used to optimise performance. If you use GPS or make online purchases, have you noticed how, the next time you go online, you are shown the right recommendations? The data you generate online comes back to you as optimised performance, thanks to data science insights.
If you wish to progress in this field, there are certain skills you will need: statistics, knowledge of data science tools, business acumen, excellent communication skills, an inquisitive and analytical mindset, the ability to work with data, and the ability to find patterns and extract information.
The good news is that you don't necessarily need a degree like a PhD; a certification in the fundamentals of analytics, along with the required technical skills, is a good starting point.
Since data science is a vast landscape, it is not possible to have working knowledge of all of it; knowledge of globally recognised technologies like SAS, R, Python, SQL databases and Hadoop will make it easier for you to switch into or enter the field.
Data science requires niche skills and a deep understanding of analytics.
Analytics can be broadly classified into three categories:
- Descriptive Analytics – as the name suggests, describes the information uncovered from the data pool. For example, while analysing how many people are interested in pursuing a career in data science, descriptive analysis might report that 35% of interested respondents are from IT, 27% are Statistics graduates, and so on.
- Predictive Analytics – again as the name suggests, forecasts events that could happen based on historical data. With predictive analysis, one could estimate how many people from IT will enter the field by studying past turnout.
- Prescriptive Analytics – gives you solutions, or prescriptions, recommending corrective actions to achieve the desired results. (A short code sketch of the first two categories follows below.)
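As a minimal, hypothetical illustration of the difference between descriptive and predictive analytics, here is a short Python sketch; the survey figures, column names and yearly turnout below are invented purely for the example, and pandas and scikit-learn are assumed to be installed.

```python
# A minimal, hypothetical sketch: descriptive vs. predictive analytics.
# The survey numbers below are invented purely for illustration.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Descriptive: summarise what the (made-up) survey data says.
survey = pd.DataFrame({
    "background": ["IT", "Statistics", "Other"],
    "respondents": [350, 270, 380],
})
survey["share_pct"] = 100 * survey["respondents"] / survey["respondents"].sum()
print(survey)  # e.g. 35% from IT, 27% from Statistics, ...

# Predictive: fit a simple trend on past yearly turnout and forecast next year.
history = pd.DataFrame({"year": [2014, 2015, 2016, 2017],
                        "it_entrants": [120, 150, 190, 230]})
model = LinearRegression().fit(history[["year"]], history["it_entrants"])
print("Forecast for 2018:", model.predict(pd.DataFrame({"year": [2018]}))[0])
```

Prescriptive analytics would go one step further, recommending what action to take given such a forecast.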
Machine learning is another fast-growing vertical in data science. To put it simply, it is the ability of machines to learn from data through algorithms, with minimal explicit programming.
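As a tiny, hypothetical illustration of that idea, the snippet below lets a model infer a pass/fail rule from made-up examples, rather than the rule being programmed explicitly (scikit-learn assumed installed; the data is invented).

```python
# A tiny sketch of "learning from examples instead of explicit rules".
# Invented data: [hours studied, past score] -> passed (1) or not (0).
from sklearn.tree import DecisionTreeClassifier

X = [[1, 40], [2, 55], [3, 45], [5, 70], [6, 80], [8, 90]]
y = [0, 0, 0, 1, 1, 1]

model = DecisionTreeClassifier().fit(X, y)   # the machine infers the rule itself
print(model.predict([[4, 65], [7, 50]]))     # predictions for unseen students
```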
The Internet of Things (IoT) is another technology contributing significantly to the field of data science. It is basically an ecosystem of devices connected to each other via the internet. IoT is all about data generation, and data science is all about data analysis.
Learning data science is not sufficient; you also have to practise it. Look for courses that offer case studies and projects with real-life data sets to work on. You will need this edge to have a rewarding career in data science.
Predictive Analytics – Explanation in a Simple Way
Let's say you own a company. To increase revenue and improve your product's market positioning, what would your approach be? Would you like to predict and formulate ways to acquire more clients, or would you prefer to predict which existing customers are likely to attrite and what you could do to retain them? What if you had access to the best marketing strategy, accurately targeting the potential audience, or the most accurate prediction of cross-selling and upselling opportunities with your customers? What if your company could predict, almost with certainty, what each existing customer wants, so you could devise plans to address those needs? Sounds exciting?
Most businesses are customer-centric, directly or indirectly, and this kind of information could be of colossal advantage for the growth and sustenance of most organisations. We have entered the era of the empowered customer, where, to survive in the long run, organisations are ferociously finding ways to connect with the customer. Organisations acknowledge that to sustain and grow they need to have more evolved interactions with their customers, and that is easier when they have a strong knowledge base about those customers. This is predictive analytics for you.
Predictive analytics is viewed by most people as a branch of big data; although that is partially true, it is vast enough to be considered a separate discipline in its own right.
Predictive analysis allows a more proactive approach, anticipating and predicting outcomes and behaviours based on data rather than assumptions alone. What's exciting is that it not only predicts, but also suggests actions derived from the findings and provides decision options.
A typical predictive analytics project unfolds in stages. First, a project is designed and data mining prepares and cleans the volume and variety of data gathered from various sources; reporting analysis is then performed on this data, giving insights into what happened and why it happened.
The next stage is monitoring: getting a real-time understanding of what is happening now.
Then comes predictive analytics proper, where statistical analysis is performed to gain insights into the future, in line with the hypothesis.
Predictive modelling and deployment provide possibilities of what could happen, derived from historical facts and behaviours; taking these into account, an action plan is suggested to achieve the desired output.
Finally, model monitoring ensures that models are managed and reviewed for performance, so that they keep providing decisions as expected.
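To make the predictive-modelling step above a little more concrete, here is a minimal, hypothetical Python sketch; the churn-style column names and values are invented for illustration, scikit-learn and pandas are assumed installed, and a real project would wrap far more data preparation, validation and monitoring around it.

```python
# A minimal, hypothetical sketch of the predictive-modelling step.
# Column names and values are invented purely for illustration.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Historical customer data (made up): usage, complaints, and whether they churned.
data = pd.DataFrame({
    "monthly_usage": [10, 45, 3, 60, 25, 5, 70, 12, 55, 8],
    "complaints":    [3, 0, 5, 0, 1, 4, 0, 2, 1, 6],
    "churned":       [1, 0, 1, 0, 0, 1, 0, 1, 0, 1],
})

X = data[["monthly_usage", "complaints"]]
y = data["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Fit a model on the past ("what happened"), then check it on held-out data.
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Score a new customer: a high churn probability might trigger a retention action.
new_customer = pd.DataFrame({"monthly_usage": [7], "complaints": [4]})
print("Churn probability:", model.predict_proba(new_customer)[0][1])
```

The "model monitoring" stage would then track how such predictions hold up against real outcomes over time.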
Predictive analytics and its applications have great scope across a variety of industries.
Customer-centric businesses apply predictive analytics to achieve objectives in marketing campaigns, sales, and across the customer lifecycle, from acquisition to retention.
Healthcare applies predictive analysis to gauge the risk factors of patients before they develop certain conditions.
Fraud detection applies predictive analysis to flag suspicious credit applications.
Marketing applies predictive analytics to identify the most effective combination of product version, marketing material, communication channel and timing to use when targeting a given customer.
There is a lot more to the world of predictive analytics; it is no longer in a nascent stage and is growing fast. Acquiring a basic understanding of how predictive analysis works would be a very wise thing to do.
The Impact of Machine Learning on Web Applications
As a type of artificial intelligence, machine learning is changing and reshaping how people do business. Most companies are investing in it purely because of the magnitude of its impact on business. Machine learning is a big deal and is already affecting our lives in ways we are not consciously aware of.
Machine learning, as is commonly known, uses algorithms to make computers learn on their own, without having to be explicitly programmed at every step. This automated analytical modelling helps find hidden insights in new data that might not be found with a traditional analysis. For this reason, machine learning is creating a lot of buzz in software development these days, and some predict it will transform how web applications are developed.
Read on to learn about the impact of machine learning on web development.
Data Mining
Data mining techniques are used by enterprises to produce new information from colossal pools of data. Web mining, a data mining technique, is used by many websites to discover patterns and insights in the available data. Like data mining, machine learning detects patterns; however, machine learning can also automatically act on the patterns it detects.
Comprehend Customer Behavior
Machine learning capabilities are also applied in web applications to better understand customer behaviour and engage customers appropriately. An e-commerce website can infer which products suit a specific customer based on their searches, which can lead to better conversion for those products. It can also use algorithms to better understand the features customers want. Machine learning algorithms can even be used to communicate with the customer, analysing their queries and perhaps connecting them with the customer service team, all in the name of enhancing customer experience. A simple sketch of this idea follows.
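As a toy illustration of recommending products from a customer's search history (the catalogue, search terms and scoring rule below are all invented for this sketch; real systems use far richer behavioural models):

```python
# A toy sketch: rank catalogue products by overlap with a customer's recent searches.
# The catalogue, searches and scoring rule are invented purely for illustration.
catalogue = {
    "trail running shoes": {"running", "shoes", "outdoor"},
    "yoga mat":            {"yoga", "fitness", "mat"},
    "wireless earbuds":    {"audio", "wireless", "earbuds"},
    "fitness tracker":     {"fitness", "running", "wearable"},
}

recent_searches = ["running shoes", "fitness watch"]

# Score each product by how many of its tags appear in the search terms.
search_terms = {word for query in recent_searches for word in query.split()}
scores = {product: len(tags & search_terms) for product, tags in catalogue.items()}

# Recommend the highest-scoring products first.
for product, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(product, score)
```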
Modified information
Social networking sites personalise content for each user. Facebook, most famously, uses machine learning algorithms to build a personalised news feed for every user, combining predictive and statistical analysis to detect patterns in the content the user reads and likes. While developing web applications, programmers can embed the same techniques and deliver personalised content to users based on their choices and preferences.
Expedite Product detection
Large companies with online portals are developing machine learning algorithms to deliver smart search for individual users. These algorithms assist the user in finding relevant products faster.
Mitigating Security Threats
A number of security firms already use machine learning techniques such as logistic regression to identify malicious websites among many. Similarly, some enterprises use classification algorithms to identify phishing websites based on criteria like domain identity and data encryption techniques. This will make it easier for programmers to protect their web applications from evolving security threats in the future.
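As a rough sketch of the classification idea described above, here is logistic regression on entirely made-up features and labels; real phishing detectors use much larger feature sets and training data, and scikit-learn is assumed installed.

```python
# A rough sketch of phishing-site classification with logistic regression.
# Features and labels are entirely made up for illustration.
from sklearn.linear_model import LogisticRegression

# Hypothetical features per site: [uses_https, url_length, num_subdomains]
X = [
    [1, 20, 1], [1, 25, 1], [0, 80, 4], [0, 95, 5],
    [1, 30, 2], [0, 70, 3], [1, 22, 1], [0, 88, 4],
]
y = [0, 0, 1, 1, 0, 1, 0, 1]  # 1 = phishing, 0 = legitimate

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Classify a new, unseen site (hypothetical feature values).
new_site = [[0, 90, 5]]
print("Phishing probability:", clf.predict_proba(new_site)[0][1])
```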
Overall, machine learning is changing not only how web applications are developed today but also how they will be developed in the future. All these applications aim to, and will, enhance the customer experience. Web programmers also have the freedom to combine multiple machine learning APIs to further enrich the customer experience by offering faster and smoother applications.
Benefits of Learning Python Over Other Programming Languages
The idea of Python originated sometime in 1989, specifically to overcome the shortcomings of the ABC language. Python was created by incorporating the good features of ABC and adding new, desired features, notably extensibility and exception handling. Thereafter, many versions of the language were released, with newer and better upgrades to existing features. Python today has multiple implementations written in different host languages, such as Java and C; these can also interact with other languages through modules, most of which are open source.
Python is used across a variety of domains, for example GUI-based desktop applications, graphic design and image processing applications, games, web applications, business and enterprise applications, and language development and prototyping, so the scope is vast.
Python is often considered different from, and better than, other languages purely because it is very user-friendly, especially if you are starting out: its syntax is simple, readable and comprehensive. It has nice built-ins and a good selection of libraries, among other benefits. It is also widely used in developing popular web products such as YouTube, Google and Yahoo! Maps.
Python is generally compared with other languages such as Java, JavaScript, C++, Perl and Smalltalk.
Read on for a quick comparison of Python with these languages
Python v/s Java
Python programs generally run slower than Java programs, but they take less time to develop. Python code is typically 3-5 times shorter than equivalent Java code. Another plus is that Python is dynamically typed while Java is not.
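As a small illustration of the brevity and dynamic typing mentioned above (a trivial, made-up example; the equivalent Java would need explicit type declarations and a class wrapper):

```python
# Dynamic typing in action: no declarations, and the same function works
# for any values that support "+", e.g. numbers or strings.
def total(values):
    result = values[0]
    for value in values[1:]:
        result = result + value
    return result

print(total([1, 2, 3]))        # 6
print(total(["py", "thon"]))   # python
```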
Python v/s JavaScript
Python and JavaScript are almost equivalent as 'object-based' programming languages. The major difference is that Python can be used both as a scripting language and as a general-purpose programming language, supporting much larger programs and much better code reuse through true object-oriented programming, while JavaScript is primarily used as a scripting language.
Python v/s C++
If Python code is 3-5 times shorter than Java, compared with C++ it is often 5-10 times shorter. Python follows a more uniformly object-oriented model than C++, and it wins as a 'glue' language that can be used to combine components written in C or C++.
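As a minimal sketch of the 'glue' idea, here is Python calling a function from a compiled C library via the standard ctypes module; the library lookup below assumes a typical Unix-like or Windows system, and in practice the compiled component would be your own C or C++ code.

```python
# A minimal sketch of Python as "glue": calling compiled C code via ctypes.
# Assumes a standard C maths library is available on the system.
import ctypes
import ctypes.util

libm = ctypes.CDLL(ctypes.util.find_library("m") or "msvcrt")

# Declare the C signature of sqrt: double sqrt(double)
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))  # 1.4142135623730951
```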
Python v/s Perl
Python and Perl share a similar background and, to an extent, similar features; the key difference is in their philosophies. Perl emphasises support for common application-oriented tasks, such as file scanning and report generation, with built-in regular expression features. Python emphasises support for object-oriented programming and common programming methodologies, which help programmers write readable and therefore maintainable code. Python is not overly cryptic, and its applicability extends beyond Perl's niche strengths.
Python v/s Smalltalk
Although both Smalltalk and Python are object-oriented languages with dynamic typing and binding, Python's approach feels closer to conventional languages. Python boasts a vast standard library with more features, while Smalltalk's class library is smaller but more refined. Python also keeps the development environment separate from the distribution of code, whereas Smalltalk traditionally uses a monolithic system image.
It is important to understand that the above comparisons are based only on language issues. While choosing a programming language, many other factors need to be considered, such as cost, availability, training, prior investment, and above all the programmer's familiarity with, or even emotional attachment to, a language.
To sum up, Python is popular among most programmers because it is easier to pick up than many other languages and comes with a wide-ranging collection of data science libraries.
Learn Web Scraping with Python
The explosion of information on the internet is a boon for data enthusiasts. This data is rich in variety and quantity, like a secret box of information waiting to be discovered and put to use. Say, for example, you are planning a vacation: you can scrape a few travel websites and, imagine the possibilities, pick up recommendations of places to visit and popular places to eat, and read feedback from previous visitors. These are just a few options; the list is endless.
How do you extract this information from the internet? Is there a fixed way to get it, with concrete steps to follow? Not really; there is no fixed methodology. The internet has a lot of unstructured and noisy data, and to make sense of this overload of information you need web scraping. Almost any form of data on the internet can be scraped, and there are different web scraping techniques, each suited to different scenarios.

Why Python? As is common knowledge, Python is an open-source programming language, so you will often find many libraries for the same task. That does not mean you need to learn every library, but you do need to know how to make a request and communicate effectively with a website.
Here are 5 Python web scraping libraries you can use
- Requests – a simple and powerful HTTP library you can use to access web pages. It can call APIs, post to forms and much more.
- Beautiful Soup 4 (BS4) – a library that can work with different parsers. A parser is essentially a program used to extract information from HTML or XML documents. BS4 can automatically detect encodings, which means you can handle HTML documents with special characters, and it makes navigating a parsed document easy, so common applications are quick to build.

- lxml – a parser with great performance and production quality. It used to be said that if you need speed you should use lxml, and for messy documents you should use BeautifulSoup, but that distinction has blurred: BeautifulSoup can now use lxml as its underlying parser. It is therefore recommended that you try both and settle on whichever is more convenient for you.
- Selenium – Requests is generally enough to scrape a website, but some sites use JavaScript to render their content. These sites need something more powerful: Selenium is a tool that can automate browsers, and it has Python bindings for controlling it right from your application, which makes it easy to integrate with your chosen parsing library.
- Scrapy – a complete web scraping framework. It can manage requests, store user sessions, follow redirects and handle output pipelines. What is phenomenal is that you can reuse your crawler and scale it by swapping in other Python web scraping libraries, for example using Selenium to scrape dynamic web pages, all while managing complex data pipelines.
To recap: you can choose between Requests and Selenium to fetch HTML and XML from web pages, use BeautifulSoup and lxml to parse them into meaningful data, and turn to Scrapy for large requirements or, if you need to, for building a web crawler.
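As a minimal sketch of the Requests + BeautifulSoup combination (the URL is a placeholder; always check a site's terms of service and robots.txt before scraping it, and both libraries are assumed installed):

```python
# A minimal sketch: fetch a page with Requests and parse it with BeautifulSoup.
# The URL below is a placeholder; respect a site's terms and robots.txt.
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com", timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# Print the page title and every link found on the page.
print(soup.title.string)
for link in soup.find_all("a"):
    print(link.get("href"))
```

Swapping "html.parser" for "lxml", or fetching the page with Selenium instead of Requests, fits the same pattern when the page needs it.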
Planning on Making a Career in IT? Think of a Predictive Analytics Program
If you are at a junction, deliberating future options and fields in which to pursue your career, especially if you plan to advance in IT, then read on to understand some approaches that could prove life-altering for you. As recent trends suggest, to stay relevant in IT over the coming years it is imperative that you adopt a skill that adds value to your organisation and your role, all the more so in the times of the Big Data boom.
Predictive analytics is one such technique that is applied within most organisations and is quickly gaining popularity due to the positive business impact it creates.
Reporting, optimising and predicting are the three things you most often do with your data, so at its essence predictive analysis is neither new nor especially complicated. But do you have a method to capture and analyse data from the future? No; there are, however, ways to predict the future by using data from the past.
Predictive analysis is thus an organised, scientific approach used to make predictions about unknown future events. Techniques such as statistical algorithms, data mining, modelling and machine learning are applied to past and current data to make predictions about the future.
In fact, the need for predictive analysis emerged from the desire to turn readily available raw data into informative insights that can be used not only to understand past patterns but also, based on those patterns, to develop a model capable of predicting future outcomes.
Predictive analytics can be adopted by any organisation with a defined business goal, such as reducing risk, increasing productivity or increasing revenue.
Some examples are:
- Banking and financial services, to measure the likelihood of fraud and assess credit risk, among other uses.
- Retail and e-commerce, to plan product inventory and promotional events.
- Health insurance, to detect fraud and identify patients at risk of chronic illness.
- Manufacturing, to reduce quality and production issues, plan service distribution and optimise resources.
- Government, for campaigning and to better understand human behaviour.
These are just a few of the areas that benefit from the right use of predictive analysis techniques.
These days, businesses across industries understand the significance of predictive analytics. However, research suggests that a staggering 53% of businesses feel there is a huge gap between 'what needs to be done' and 'what is currently done': for lack of properly skilled and trained resources, the results derived from analysis may not yield accurate and valuable insight. Interestingly, 89% of businesses understand the value of predictive analytics and feel the need for dedicated staff to perform it, treating it as a separate team rather than an extension of BI. Hence there is a sudden spike in demand for skilled staff in the big data domain with the appropriate predictive analysis skill set.
Imarticus Learning offers a well-rounded program on predictive analytics; the course provides a comprehensive, hands-on understanding of predictive analytics using SAS, the market leader in business analytics.
The course covers essential skills like SAS programming basics, Hypothesis Testing, Predictive Modelling Techniques, Regression Analysis, Multivariate Analysis and Forecasting.
The course teaches through live case studies and real-life business problems, offering an industry-aligned curriculum delivered by expert industry faculty, combining self-paced online and classroom training, and includes career assistance services, making you job-ready from day one and helping you land your dream job.
To get started in the field of Big Data analytics, choose the best and learn from the best; explore your options with Imarticus Learning now!
Growing Need for Hadoop
In this blog, we will discuss the growing need for Hadoop in the coming decade.
Almost every big data processing technology that exists today builds on Hadoop, which works as the storage layer. The size of data is increasing exponentially, so the need to store that data is also increasing by leaps and bounds. It is quite clear that Hadoop and other big data technologies have a lot of scope over the coming years.
Hadoop's USP lies in its powerful features: it works as a base for other technologies, it is fault tolerant, and it manages resources. Hence there is no immediate technology that can replace Hadoop; there is certainly good competition, but Hadoop is leading the way at present and will continue to do so in the near future.
It is recommended that software engineers with knowledge of Java consider training in Hadoop, as it will further equip them to manage big data and extract valuable insights from existing data with ease.
There are many additional benefits to consider when upgrading your skills in Hadoop.
With Hadoop skills as your stepping stone, you can choose how to steer your career among the variety of roles available, based on your interests and expertise.
As a Hadoop Developer, one is responsible for coding and developing Hadoop-related applications; the prerequisites for this role are knowledge of core Java, databases and scripting languages.
Hadoop Architect is another option, oriented more towards planning and designing system architectures, and responsible for the development and deployment of Hadoop applications.
A Hadoop Tester is responsible for ensuring smooth functioning across various scenarios, conducting quality checks, and removing bugs that could hinder the proper functioning of the application.
One can also consider the role of a Hadoop Administrator; this role parallels that of a system administrator in the Hadoop universe. Key tasks revolve around maintenance, backup, recovery and the setting up of Hadoop clusters.
Knowledge of Hadoop will also be an asset if you wish to advance your career as a Big Data Scientist; success in this role requires combining the technical skills of a software programmer with the analytical mindset of a scientist, analysing colossal amounts of data and making intelligent, application-based decisions that benefit clients.
Another reason to consider a career in Hadoop is the remuneration. Depending on your skill set, additional qualifications, experience and the company hiring you, salaries for Hadoop professionals in big data analytics vary from around 3 lakhs to 18 lakhs and above.
Lastly, it is clear that there is a growing need for Hadoop in the coming years; many applications and skill sets require an understanding of Hadoop, and as a result many big names are hiring Hadoop experts, including Amazon, eBay, Yahoo, Microsoft, Netflix, LinkedIn, Dell, Oracle, Cloudera and Hortonworks.
It is a big sea of opportunity out there, for freshers and experienced professionals alike. Hadoop is here to stay, professionals with expertise in these skills have a bright future, and companies are paying well for that talent, so upgrade your skill set and become a part of the tide!
IT Professionals Must Adopt Machine Learning Skills
People in the IT profession are not basking in the sunshine the way they did over the last decade. Some IT giants have recently announced layoffs of close to 2% of their workforce.
IT professionals in low-skilled IT and BPO positions face a grim future, with the possibility of their jobs becoming redundant as early as 2020. Most of these low-skilled IT jobs will be lost in areas such as automation support and back-office support. Highly skilled jobs can be considered secure for the time being, but those professionals will still need to add value to their roles by upgrading their skill sets.
It's the age of digital disruption. While it is true that automation will indeed create new job roles and responsibilities, with growth predicted at about 14%, the industry is still expected to take a net hit of about 4.8 lakh job losses over the coming years.
And the final nail in the coffin has been US immigration rules becoming more stringent, with H-1B visas being denied to Level 1, low-skilled IT roles such as entry-level software engineers.
All of the above points to a potential loss of opportunities for IT professionals.
So what do software engineers and network admins do to survive in the job market? Simple: have a plan B in place. Do not let the sword swing close to you; avert it by venturing out of your comfort zone and adopting a new skill that will, in time, make you more valuable. Change and upskill to survive.
You need to make yourself relevant to changing demand rather than staying stuck in traditional roles. A recent WEF study predicts high demand in data analytics, cloud computing, mobile internet and IoT (Internet of Things). These sectors will see exponential growth in the global market, driving a rise in employment in these areas in the coming years.
As an IT professional you likely have sufficient technical experience, along with other skills, so your comfort and literacy with technology are here to stay; it is all about how you work your way into relevant roles.
Riding the wave of big data analytics is a safe bet, and a specific field generating plenty of buzz these days is machine learning.
If you plan to make a career in machine learning, the prerequisites are a research mindset along with software engineering skills, computer science fundamentals and programming. The good news is that you can enrol in data analytics certifications and courses that help build clarity on probability and statistics, data modelling and evaluation techniques, and an understanding of machine learning algorithms and their applications.
At the end of the day, a machine learning project's deliverable is software. You need to be able to fit the small pieces into the larger scheme of things, and software engineering best practices are invaluable for productivity, collaboration and sustainability.
The global markets are definitely changing, and the demand for machine learning is rising exponentially. So if you are a software engineer, now is the time to adopt this skill and develop the mindset to succeed.
Data Analytics – A Big Career Dream
Data analytics, big data, data scientist: these are no longer big terms from a faraway profession. These words, or rather roles, are becoming catalysts, impacting the growth of our businesses and enhancing the overall experience of our daily tasks.
Our online presence is no longer a matter of choice; we find ourselves using online portals to shop, consult a doctor and do research. From planning a vacation to preparing for motherhood, from marriage and dating to banking and even school and college admissions, all of this is done online, and we use social networks to express ourselves through tweets, posts and more.
This extensive usage of the internet creates online activity logs that contain humongous amounts of data.
Now add the cameras mounted on almost every street corner and satellite-based services like Google Maps and Google Earth; these too collect large volumes of data on how people conduct themselves.
All this data is being collected around the clock, both in real time and historically, and it then needs to be extracted and interpreted. That is easier said than done: the data is huge, and extracting and explaining it cannot be done effortlessly. Most of the data collected is unstructured and not verified, so you need to be wise enough to catch the right characteristics at the right time.
People who can perform this extraction effectively and make sense of the data are called data scientists or data analysts. The competencies that help them in this task are sound knowledge of mathematics, computer science and statistics.
The job of a data scientist is not only to extract and analyse data, but to clean it in such a way that they can also predict and forecast trends for a given business based on certain hypotheses or conditions. That is what makes the job unique: the ability to accurately pre-process data and then predict and forecast is what sets one data analyst apart from another.
A career in big data has become a dream choice for many job seekers these days, and there is a lot an organisation can achieve with the right application of data science. Some companies have recognised this and are training their internal staff in the skills required, while others are not yet ready to hire a full-time resource. However, the day is not far off when the position of data analyst will be essential in every organisation.
If you are planning to enter the data science industry and build a great career in big data, then in addition to the prerequisites mentioned above you need to acquire competencies and expertise in data analytics tools: for example, programming languages like R, Python and SAS, plus a working knowledge of machine learning and predictive analysis. Sound knowledge of the industry you plan to work in, such as healthcare, IT or education, will be an added advantage.
There is a huge gap between demand and available talent in the field of data science, so making a career shift in this direction would be both wise and lucrative; recent research suggests that data scientists earn more than experienced engineers. Clearly, this is a field with huge potential.
Do take up certifications that will help springboard you into the field of data science.