Process improvement through nudging

As business analysts, we are often called in to look at ways to improve a current process. The measurable improvements the business wants, in order to justify the effort, typically fall under:

  • Quality
  • Reduced costs
  • Increased processing per hour

Any process to be improved has a certain amount of dynamic variability to it. From a high-level mathematical perspective, such processes are treated as “dynamic resource allocation” problems because of that variability. By controlling the variability with nudges, we can improve the process.
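To make the variability point concrete, here is a toy simulation (the numbers are invented, not from any real engagement) showing how two processes with the same average demand and the same capacity behave very differently when the day-to-day swing is reduced – which is exactly what a successful nudge does:

    import random

    random.seed(1)

    def average_backlog(mean_demand, stdev, capacity, days=10_000):
        """Fixed daily capacity; work that exceeds it carries over as backlog."""
        backlog = 0.0
        total = 0.0
        for _ in range(days):
            arrivals = max(0.0, random.gauss(mean_demand, stdev))
            backlog = max(0.0, backlog + arrivals - capacity)
            total += backlog
        return total / days

    # Same average demand, same capacity -- only the variability differs.
    print(average_backlog(100, 30, 110))  # noisy process: persistent backlog
    print(average_backlog(100, 5, 110))   # "nudged", steadier process: near zero

The steadier process clears its work almost every day, while the noisy one carries a standing backlog despite identical averages.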

  • NOTE: With the advent of stronger AI, in the future we will see more reliance on AI to advise on the best way to improve a process, and it will be left to the Business Analyst to help put that advice in place.

What is “nudging” and how is it used to improve a process?

Nudging is where we don’t force a change to the process or add new processes, but instead nudge the behavior of the participants in the current process to get the desired results. A current example is financial institutions offering rewards to customers who go paperless for their statements. Going paperless improves:

  • Throughput, as the percentage of outstanding statements processed per hour rises with a smaller printing backlog.
  • Speed of delivery, as statements arrive in hours instead of days.
  • Quality, in the sense that the statement cannot be delivered to the wrong address, damaged in printing, etc.
  • Cost reduction, as mailing costs drop.

You can see from the four bullet points above that a lot can be achieved just by nudging the customer in the statement process to no longer expect a paper statement.

So the next time you are looking at improving a set of business processes, ask yourself whether you can achieve measurable improvements by “nudging” the current users of the process in the right direction, at less cost than forcing through changes or building solutions that have to manage many variables.

The Data Lake – understanding the concept


As data capture has grown, so have the techniques for handling the data. For about ten years now, the Data Lake has been appearing in the business world as part of the data capture landscape.

When I started out, data was distributed all over the place, and business analysts had to ask various departments for extracts to get an overall view of the company. It was time-consuming.

Next came the large data warehouse, accepting data from all over the company into a central store. However, it could take years to get data into the warehouse; at one place I worked, it was a minimum of two years. The delay came from the need to model the data and understand it completely before it could be absorbed: data modelers had to work out whether new tables were needed, and BAs had to justify the business cost of storing the data. On top of this, existing reports were expected to use data from the warehouse, so they all had to be rebuilt against the new data structure.

As companies evolved to produce even more data, data warehouse wait times increased significantly. Waiting for centralized data did not tie in well with the corporate strategy of knowing what is going on around the company, and at this point the Data Lake concept came into being. The Data Lake is basically a collection point for all data from around a company, in any data structure. Data does not need to be refined to end up in the Data Lake; good and bad data alike are collected. Visually, the term represents the departments that generate data as streams feeding the lake.

As data collects in the Data Lake, some of it will eventually make its way into the enterprise data warehouse based on need and cost justification. The Data Lake approach creates a single source of data for people in a company to access. Data scientists can look at what is being captured and see whether any of it is useful to what they are trying to analyze.
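To make the concept concrete, here is a minimal ingestion sketch (the paths and field names are hypothetical; real lake platforms such as HDFS or S3 differ) showing the two defining behaviors described above: data lands unrefined, and searchable metadata is recorded at capture time so the data can be discovered later:

    import json, time, uuid
    from pathlib import Path

    LAKE = Path("datalake/raw")  # hypothetical landing zone

    def ingest(source_dept, payload, description):
        """Land the data exactly as received (no modeling, no cleansing),
        but always record searchable metadata alongside it."""
        entry = LAKE / source_dept / str(uuid.uuid4())
        entry.mkdir(parents=True, exist_ok=True)
        (entry / "data.raw").write_bytes(payload)  # good or bad, unrefined
        (entry / "metadata.json").write_text(json.dumps({
            "source": source_dept,
            "description": description,  # what a data scientist would search on
            "captured_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        }))
        return entry

    ingest("billing", b'{"invoice": 123}', "raw invoice events from the billing app")

Note that the metadata file is the only thing standing between this data and the discoverability problem listed in the cons below.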

Pros of Data Lake:

  • Centralized repository of company data, which in theory makes data easier to find.
  • Quick to capture data into, as nothing is refined in any way.
  • Allows the source departments to focus on supporting their applications / business rather than on providing formal data extracts that have to be absorbed by a data warehouse or other team.
  • No waiting on another department’s resource availability to get access to its data.

Cons of Data Lake:

  • Resources have to be hired to support collecting data into the lake and sharing it.
  • Failure to capture good, searchable metadata on the data being stored in the Data Lake prevents that data from being discovered later.
  • Resources associated with the original data generation are not part of the Data Lake team, so first-hand knowledge of the data on that team is limited to non-existent. Data knowledge relies entirely on the metadata captured when the data is stored.
  • Useful and not-so-useful data alike is captured, as the focus is simply on capturing data.
  • Dependent on cheap storage to justify the large storage costs and the resources to support the physical storage, networks, etc.
  • Secure data should not end up in a Data Lake, due to the risk that it may be exposed.
  • Not suited to operational reporting where reports have to be generated within 24 hours of the data being created.

In summary, the Data Lake concept is just a fancy way of saying “a centralized raw data store fed by the different departments in a company.” A data warehouse can pull data from the Data Lake for formal storage later, once the need for it has been identified.

What kind of business are you in?

The question “what kind of business are you in?” seems simple enough and is a standard question businesses ask themselves to stay relevant and not lose sight of their market. However, as we know, the answers to simple open questions can end up being complicated. The classic wrong answer is the railroad company that thinks of itself as being in the railroad business, not realizing it is in the transportation business. An extreme example of bad decision making was Kodak, which did not realize it was in the memory / emotion capture business and instead focused on providing film and print material because that had made money for over 100 years. By the time Kodak realized what business it was in, it was too late.

You might be wondering what direction I am taking this in. I want you to consider how you would answer this question in relation to your current career as a business analyst.

As business analysts, I consider we are there to help improve profits and/or reduce costs for the companies we work at. However, most employers (who are actually our customers) don’t see our role that way; instead, they look to us for specific knowledge and experience, for example:

  • Payment handling
  • Healthcare data processing
  • General data analytics
  • Anti-money laundering
  • Utilities
  • Mobile application development
  • etc.

This narrow role definition by our customers puts us back into the mental mode of thinking we are in the railroad business rather than in transportation. Our customers are not going to tell us that they plan to make us obsolete with a new solution to their business needs, or that they are losing market share in their industry (leading to job losses). We have to think beyond what we immediately provide to the customer and consider at least two things in our careers:

  1. Industry trends
  2. Tools we use

Industry Trends:

  • Is the industry we are working in shrinking or growing in our geographic location? Example – think of factory closures or corporate mergers, either of which reduces the number of people needed in the industry.
    • To overcome this, you would need either to gain experience / knowledge in a new industry or to move to where the work is (if that is an option).
  • Are there current or future disruptions to the way work is done in our industry that we need to be aware of? Example – looking at the railroad, the rails, trains and railcars are just tools used in transportation. Certainly they helped the railway business make money, but as the railway companies in America found out after the interstate highways were built, new options for transportation by road upset the apple cart. Money invested in trains and railcars was lost because those tools did not work on the road. Being only in the railroad business meant a loss of market share, declining profits and declining employment opportunities.
    • To overcome this, stay aware of advancements in technology / process that could impact your industry, seek knowledge of and experience with the new, and even consider changing industry if the new will make yours obsolete or shrink its market share, causing a reduction in employment.

Tools We Use

  • Are the tools required to do your job changing? Example – with the move to more Agile IT work, we are expected to have used formal tools for managing user stories, backlogs, etc. Reporting is another area where tools are continually evolving.
    • To overcome this, monitor the tools specified in job postings before your next job search, set aside a budget for training, get the training, and if possible work out how to get hands-on experience with the tools.

In summary, don’t let your current success with customers blind you to the market. Stay current on which industries are growing or shrinking and on the tools you need to do your job; that way you will continue to help companies improve their profits and reduce their expenses. Budget time and money to keep yourself marketable to customers, and be prepared to ditch an industry if the future looks grim. Don’t focus on pure profit; invest in yourself to stay in line with the market, otherwise you may become the next Kodak.

3 Generic Certifications that help you get IT BA interviews!

There is no getting past it: the IT BA market has become saturated. It is no longer enough to have worked as a BA for years, because the market is full of that experience. So the question becomes: how do you make it into the interview pile instead of the reject pile?

Today I want to focus on three generic IT certifications, not tied to any industry or solution, that can help move your resume into the pile to be interviewed.

#3 Certified Business Analysis Professional (CBAP) or equivalent: This one has been around for quite a few years now. If you have been doing BA work as long as I have, it really does not bring much value in terms of knowledge. If you have less than 10 years of experience, it is a good one to add to your resume. However, its value has somewhat diminished with Agile development.

Pros:

  • Shows that you have at least been educated as a BA.
  • Great for when you have limited real world experience.

Cons:

  • Has not become a job requirement in the way the A+ certification has for PC repair.
  • BA roles differ from company to company so some companies add more or less weight to the certification.
  • Does not carry as much weight in the Agile development world.

#2 Certified Scrum Master: You can look on this certification as selling yourself to the client as two for the price of one. For the longest time, clients have liked to put their BAs in the role of backup Project Manager; being a Scrum Master is the new flavor of that which Agile development has brought us.

Pros:

  • Shows that you understand Agile development.
  • Makes you more appealing to the client as you can now fill two roles.
  • Could increase your salary as Scrum Masters can make more money than ordinary BAs.

Cons:

  • You may end up doing more Scrum Master work than BA work.
  • Could make your life busy as you juggle two roles.
  • You may not like being a Scrum Master.
  • Only applicable to Agile development. For non-Agile work, you could look at Project Management certifications instead.

#1 Certified Product Owner: You can look on this certification as the natural career progression for a BA involved with Agile development. Any BA who wants to stay in the BA world should look to get this certification sooner or later. It shows a client that you understand Agile and that you understand the BA role through the Product Owner viewpoint. With the advent of the Product Owner role, certain tasks normally performed by the BA have moved to the Product Owner, which is why it is not a large step for a BA to move into this role.

Pros:

  • Shows that you understand Agile development.
  • Makes you more appealing to the client as it shows you should be able to represent what the business wants.
  • Could increase your salary as Product Owners are more involved with the money making side of the business.

Cons:

  • Mostly applicable to Agile development, though it does carry over into other development methods.
  • May not pay as well as the Scrum Master route.

In summary, if you are wondering how to get more interviews as an Information Technology BA, getting at least one of these generic certifications can help you move forward. Which school or method you choose matters less than actually having a certification you can add to your resume.

From a long-term perspective, you will need to decide whether to go down the higher-paying Scrum Master path (which is more like the old Project Management) or to move into Product Ownership, the natural next step for Business Analysts.

Business Analysts who want to become Product Owners should know these two things.

As the market changes for business analysts and more of them consider the move into the product owner world, the question becomes: what is the difference between the roles? Product owners can sometimes be just business analysts with a new job title; in other cases they are genuinely product owners with full authority to make decisions.

Agile development has driven the growth of the product owner role. No longer do business partners have to wait months for development to implement new features / functions; they can now be delivered in weeks. Since business partners usually have to run the business, they don’t have time to spend on Agile work, so they delegate business representation to the product owner.

Now, let us consider two of the key differences in the product owner role vs business analyst:

  1. Industry knowledge – in the traditional BA role, there is usually time to get up to speed in the industry being worked in (retail, utilities, finance, health, transportation, etc.) as the requirements are gathered, so industry knowledge is not a deal breaker for being hired. In the product owner world, you had better know the industry, as decisions have to be made quickly to keep development moving. For example, glass devices are not allowed in food processing plants, so a product solution built on cell phone applications would be a bad decision for any work inside food processing plants because of the glass touch screen.
  2. Metrics / research – product owners need to make decisions on the priority of the features / functions to be developed. As a product owner, you need to know how to justify each decision with real-world facts. This requires an understanding of the research options and data available, and of the metrics desired from new development (think Google Analytics), combined with any restrictions on the data that can be collected or the solutions that can be delivered. Business analysts, on the other hand, normally get this information and direction from their business partners.

How do you get the skills needed to be a product owner?

  • Industry knowledge can be gained either by working as a business analyst in the industry for a period of time or by getting a job on the business side. Either option is a good route to the necessary experience.
  • Knowing the research options / metrics used to justify decisions is not always required, as not all companies expect this of their product owners. For those that do, joining external industry groups, reading trade publications, staying on top of trends, working on the business side, etc. can all help build up the knowledge required to justify decisions.

Are criminals and government fines driving new requirement methods?

We have all had the business partner who never quite tells us everything we need to know on a project, but spare a thought for those who design solutions to defeat criminals. In their case, the criminal is not sharing what he does, and most design is done in reaction mode.

One area that has received recent focus is money laundering. Financial institutions that end up involved in money laundering risk not only loss of money and reputation but also fines imposed by their own government or even other governments. Identifying money laundering has gotten so far out of control that nobody really knows how to define the complete set of requirements for detecting it.

Traditional requirement methods basically no longer work here. With traditional methods, the Business Analyst identifies / captures the business rules and then implements them. Unfortunately, the people who make these particular business rules are not the ones sharing them with us.

Criminals don’t tell us that if they do X, Y and Z then they are laundering money. The criminal’s desire is to fly under the radar as a normal customer. Their methods for appearing ordinary have gotten so good that the people trying to create after-the-fact rules to identify money laundering can no longer keep up. This puts financial organizations in a bit of a pickle.

Governments have made it so that financial institutions cannot just ignore the problem of money laundering and hope for the best. After all, if a financial institution goes belly up, it can affect a whole country. To avoid this worst-case scenario, a government may force a financial institution out of business if it is not confident the institution is compliant with the law. If you want business owners / board members to do something about a problem, threatening their business is one solid way to go about it, and governments know this.

To get around not knowing which business rules to implement to identify money laundering, financial institutions are turning to Artificial Intelligence (AI) to fill the knowledge gap. AI scans through large amounts of data to learn, establish, monitor and update the business rules that identify actual or potential money laundering. Systems then apply those rules on the fly to freeze accounts, recover laundered money and notify government and law enforcement agencies.
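The post does not name a specific technique, but one common unsupervised approach that fits this description is outlier detection, where the model learns what “ordinary” looks like rather than being given hand-written rules. Below is a minimal sketch using scikit-learn’s IsolationForest on entirely made-up transaction features; the feature names, numbers and contamination threshold are illustrative assumptions, not a real detection system:

    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    # Made-up features per account: [avg amount, transfers/day, distinct counterparties]
    normal = rng.normal([200, 2, 3], [80, 1, 1], size=(5000, 3))
    unusual = rng.normal([9500, 40, 60], [500, 5, 10], size=(20, 3))
    accounts = np.vstack([normal, unusual])

    # Learn the shape of ordinary behavior and flag outliers -- no explicit rules.
    model = IsolationForest(contamination=0.005, random_state=0).fit(accounts)
    flags = model.predict(accounts)  # -1 means flagged for human review
    print(f"{(flags == -1).sum()} accounts flagged for review")

A model like this only surfaces candidates; as discussed below, humans (and Business Analysts) still have to route and act on what it finds.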

While I am not at liberty to talk about the specific data being worked with, I can discuss what this means from a Business Analyst perspective. Data scientists and AI engineers will take over the role of capturing and implementing the business rules that identify money laundering; previously this was handled by the Business Analyst. However, before you cry about the loss of another piece of Business Analyst work, new opportunities will open up:

  • AI needs data, and lots of it. Business Analysts will be recruited to provide data interfaces into the AI machine, at least for the next little while. Eventually the desire is to move to more of a web-crawler approach, where the AI establishes new sources of data with little to no human intervention.
  • While AI will be good at identifying that an action is needed, it will not be good at implementing the action (at least until we build the fictional “Skynet”). Business Analysts will always be involved in ensuring that the action is communicated to where it needs to go and that any automatic response within an organization is performed. Given how regularly companies, governments and law enforcement agencies reorganize themselves, this is unlikely ever to be a static solution, and that changing environment should keep Business Analysts busy for a while.

What I think will be interesting in the future is whether the data scientists and AI engineers will be able to explain the AI’s reasoning for deciding that a particular event is money laundering; eventually it could grow beyond their understanding. I can see a future where Business Analysts are called upon to get AI systems to produce human-readable reasoning, and maybe that will become a new job task for us all.

In summary, criminals and governments are driving the need for AI to step in and generate IT requirements on the fly, ensuring that criminals are kept in check and that businesses are not shut down by governments for failing to keep them in check. While some BA work around capturing and implementing business rules will be lost, new roles will open up in support of the AI infrastructure and especially the output of the AI solutions.

Health Insurance & Tax are reducing the American Consultant’s take-home pay

This post is about unexpected costs, brought to bear by the American government, that are reducing the income of consultant BAs in America.

If you work in America, or have thought of coming to work in America, certain changes over the past few years have really reduced the take-home pay of consultants who are not self-employed – known as W2 employees, after the end-of-year tax form employees receive.

For this post, I have assumed a 30% tax rate on the consultant’s income.

To be self-employed (known as 1099, after the tax form produced at year end for workers who are not directly employed), the client you work for has to be willing to accept a Corp-to-Corp relationship, and some clients will not. If you are lucky, you might get a preferred vendor to hire you on a self-employed basis. At the end of the day, being able to work either W2 (employee) or 1099 (self-employed) increases the number of job opportunities you can pursue.

How did we get to the point of losing money?

First off, Obamacare (the Affordable Care Act) did not reduce the cost of health insurance for the majority. Instead, health insurance costs have more than doubled since it was introduced – a $500 family policy in 2014 would cost around $1,300 today, based on what I have seen.

Obamacare also removed the ability to deduct private health insurance costs from your taxable income. Private health insurance was important for consultants, as they change employers often; buying into an employer’s policy did not always make sense when the relationship might last only three months (the duration of the contract). Previously, the cost of a private health insurance policy could be deducted against your taxes whether you were W2 or 1099. What Obamacare effectively did was remove that deduction option for W2 employees. This acts like a 30% surcharge on health insurance, because you now pay for the policy with after-tax income. 1099 workers can still deduct the cost of health insurance against their income, since it counts as a business expense. However, even if you are 1099, that does not change the fact that health insurance costs have more than doubled.
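To put numbers on that “30% surcharge” claim, here is a small worked example using the post’s assumed 30% tax rate; the premium figure is a hypothetical reading of the policy cost mentioned above, not a quote:

    TAX_RATE = 0.30          # the post's assumed marginal tax rate
    annual_premium = 15_600  # hypothetical: a $1,300/month family policy

    # 1099: the premium is a business expense, so it reduces taxable income.
    effective_cost_1099 = annual_premium * (1 - TAX_RATE)  # $10,920

    # W2: the premium is paid from after-tax income -- no deduction.
    effective_cost_w2 = annual_premium                     # $15,600

    print(effective_cost_w2 - effective_cost_1099)         # $4,680 in lost tax savings

The lost deduction is worth 30% of the premium, which is where the surcharge framing comes from.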

The second reason W2 consultants are losing money is the tax law change of 2018. Previously, if your unreimbursed travel expenses exceeded 2% of your W2 income, you could deduct the portion above that threshold from your taxable income. That deduction was removed starting in tax year 2018, which means that any heavy work travel your company does not reimburse is now effectively 30% more expensive. I spoke with a recruitment agency the other day who said more and more consultants are declining W2 job opportunities that would require them to work outside their home town because of this change.

If you are W2 and need to travel for work overnight or longer, you could ask for “Per Diem”. This is an IRS-approved amount that is not taxed, with the amount based on the location you are working in. One way to negotiate is to agree on a rate and then ask for some of the income to be converted to Per Diem money (a worked example follows the list below). However, there are flaws with this approach:

  • Per Diem expires after a period of time. Once you have been in a location for 12 months, or expect to be there longer than 12 months, the IRS no longer considers you eligible for tax-free Per Diem.
  • You have to have a tax home more than 50 miles away from where you work.
  • The client or employer that you are going to work with may not want the hassle of doing Per Diem reporting.
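Here is the worked example promised above, again using the post’s assumed 30% tax rate; the hours, rate and per diem split are invented for illustration, and real Per Diem amounts come from IRS location tables:

    TAX_RATE = 0.30
    hours = 2_000  # hypothetical: a full-time contract year
    rate = 80      # hypothetical agreed all-in hourly rate

    # Plain W2: the whole rate is taxed.
    take_home_taxed = rate * hours * (1 - TAX_RATE)

    # Same agreed rate, but $10/hr re-characterized as untaxed Per Diem
    # (only valid while you meet the IRS time and tax-home tests above).
    take_home_per_diem = (rate - 10) * hours * (1 - TAX_RATE) + 10 * hours

    print(take_home_per_diem - take_home_taxed)  # $6,000 more per year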

In summary, the American system has started to close the door on W2 employees who are Business Analyst consultants by increasing their cost of doing business. Consultants will need to:

  • Pursue more Corp to Corp jobs (basically be self employed).
  • Work at reduced rates for companies that will cover their expenses.
  • Work only on local jobs that do not have expenses associated with them.

Self-driving trucks increase the risk of food and other consumables shortages


As technological advances in the automotive industry bring us closer to fully self-driving vehicles on the roads, governments would be wise to consider lessons learned from the airline industry, and business analysts should help advise our business partners on mitigating the risks.

The recent issue with the Boeing 737 MAX 8 highlights how a perceived software problem can kill people and cause a fleet of planes to be grounded. From a business perspective, airlines were lucky they had other types of planes to use while the problem was worked on. Still, this was a tragedy we all wish had not happened, and we should use it as a reminder to be sure we understand the risks of any software we work on.

Jump forward to a future where all the delivery trucks on the road are self-driving. What do we do if the self-driving software is found to be flawed, or is hacked, and the trucks have to be taken off the road? It might be days or weeks before the fleet is operational again. Food would be left rotting in warehouses and docks, unable to be delivered to its final destination, which could lead to mass panic and civil unrest.

As a forewarning of the impact we might see, the cyber attacks of 2017 showed how information systems that organize the flow of goods can be affected. Shipping containers could not be moved to their destinations because the data required to manage them was unavailable, which led to some temporary food shortages.

Self-driving trucks, however, take us into a world where the physical transport itself is at risk of being disabled. Even if we had a piece of paper showing all the destination information for a shipping container, we would have no means to move it: no manual workaround. To mitigate this risk, self-driving trucks should at least have the following features:

  • The ability to disconnect the self-driving brain from the truck.
  • Mechanisms that allow humans to control the truck directly, in manual form.
  • Physical security that lets authorized humans drive manually, rather than reliance on software security that may fail and prevent a manual override.
  • Multiple vendors, on the theory that they won’t all fail at the same time.

For a historical example of vehicles grinding to a halt and how the problem was handled afterwards, look at the OPEC oil embargo of 1973. Restriction of oil shipments to the USA meant fuel was hard to come by, bringing traffic partially to a halt. The long-term solution was for the USA to keep its own strategic reserve of oil. One could argue for keeping sets of manually driven trucks on standby, spread throughout the country, as a similar workaround.

As business analysts, we should encourage our clients / business partners to weigh the risks of their investments in new technology and help them consider backup solutions at the same time. The old idiom “don’t put all your eggs in one basket” is wise advice as software continues to replace ever more manual processes. In some places it may be better to have multiple different solutions so that there is an alternative should one fail; having a variety of planes has allowed the airlines to keep flying.

The more our technology solutions integrate with the infrastructure of the society we live in, the greater the need for a backup solution should a piece of technology fail. As business analysts, we should not forget this.

Are we building the world defined by the movie “Gattaca”?


This post is about thinking through how the projects we work on will impact the future of society as we know it. The quality of the requirements we gather can affect society later in ways we did not anticipate. Most importantly, we always need to provide a secure back door through which incorrect computer-generated conclusions can be overridden by humans; otherwise we are at the mercy of the machines.

The 1997 movie “Gattaca” is about a person’s DNA being used to determine their potential in society: DNA is used to work out which careers you have access to. The protagonist pays to use someone else’s DNA (adoption of an identity rather than theft) to achieve an objective he is barred from because of his “inferior” DNA.

The mechanics of the future life direction in “Gattaca” revolve around DNA analysis having already been worked out to determine a human being’s career potential. Once that analysis exists, computers take over processing an individual’s DNA to determine their future worth to society; no human involvement in the decision is needed any longer, as the formula is already developed and the outcome determined.

Those of us who work in IT may encounter projects that are a cog in the works of this “Gattaca” future: efforts that seek to assign a value to a human based on information received. Certainly at this point they do not include DNA sampling to determine that value, but that would seem to be only a matter of time. DNA sampling would give the potential to look beyond the immediate prediction of value and include a prediction of future performance as well.

Some of you may remember the 1995 movie “The Net”, in which the protagonist’s identity is erased, leaving her unable to do much of anything in life. Having computers store information related to our identity was the first step toward where we are now. What is significant in today’s world is how computers add points to that stored information in ways that affect your value in society.

At this point you may be asking which IT projects fall under this future “Gattaca” classification of determining human value. Here are some examples:

  • Human Resource systems – tracking of sick days and vacation days. Certain trends identified by the system can flag an employee as a risk.
  • Credit scores – speaks for itself, and reliant on information that may or may not be correct.
  • Terrorist name matching – how many Mr. Smiths get delayed at the airport?
  • Computer activity monitoring – not active enough, and your employer can terminate you.
  • Grocery store cards – are you buying healthy? Where does all that information go?
  • Job interview software – companies are pushing more and more to remove the human from the initial interview loop, relying instead on a computer interview to screen applicants. Answer a question the wrong way and you may never get past the computer.
  • Job search engines – the computer selects which resumes get reviewed for potential interviews. If you don’t spend time working out the current keywords, your resume may never be seen by human eyes (a minimal sketch of this screening logic follows the list).
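To illustrate the last two bullets, here is a deliberately simple keyword screen; real applicant tracking systems are far more elaborate, and the keywords, names and threshold below are invented for illustration:

    def keyword_score(resume_text, keywords):
        """Count how many sought-after keywords appear in a resume."""
        text = resume_text.lower()
        return sum(1 for kw in keywords if kw.lower() in text)

    keywords = ["agile", "product owner", "user stories", "SQL"]
    resumes = {
        "candidate_a": "Agile BA experienced in user stories and SQL reporting",
        "candidate_b": "Seasoned analyst; waterfall projects and requirements documents",
    }
    # Only resumes over the threshold ever reach human eyes.
    shortlist = [name for name, text in resumes.items()
                 if keyword_score(text, keywords) >= 2]
    print(shortlist)  # ['candidate_a']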

The list above represents just some of the projects you may work on that determine the current value of a human. Underlying each one are formulas sold as efficiency improvements, cost reductions, etc. for the organization that deploys the software. The side effect, of course, is that a computer now values an individual human based on the formulas applied to the data received. While they don’t use DNA yet, they are certainly getting close.

We already know that in today’s world identity theft can give the thief access to things they would not otherwise have access to; that was the premise in “Gattaca”, where the individual adopted another’s identity to achieve his goal. Identity theft would not be as easy if computers did not store points on an individual that determine their value to an organization and to society, opening or closing doors of opportunity. After an identity theft, the victim may lose their place in society for a while, if not permanently, as the computer re-scores them based on the information the theft generated. Victims are then tasked with reaching out to actual humans to correct what the computer asserts is valid, and those humans in turn depend on the computers having a back door through which the incorrect information can be overridden.

So the next time you start work on a project, ask yourself whether you are building another cog in the “Gattaca” world machine, and whether you provided a secure back-door override.

AI will end the need for IT Requirement Business Analysts

I will say up front that this is purely an opinion piece, as I don’t plan to provide references to back up the statements. The purpose is just to think about the evolution of technology that has led us to this point and its impact on the IT Requirements Business Analyst.

Historically speaking, if we start at the Industrial Revolution, machines were used to speed up production. Back then, the equivalent of a software programmer was a mechanical engineer who designed the levers, shafts, cogs, etc. to produce the desired result. More often than not, workers were required to keep the machines fed with raw material and to remove the finished product. You could think of the raw material as the equivalent of data coming into modern data processing software, and the end product as the finished use of that data, such as reports, dashboards or account updates.

When the electronic computer moved into the office world in the late 1940s, the processing of data by human clerks began to be replaced by the computer. The mission of the computer in those days was to reduce the number of humans involved in processing data. Businesses were not looking for the computer to give them guidance, just to let data be processed more quickly and cheaply. Designers of software could focus on the tasks already done by humans and create software and peripherals, such as printers, that replaced those tasks. This was done by job shadowing to understand the process.

Moving into the fifties and sixties, computers became available that could be programmed with complex algorithms so that predictions could be made from data. At this point, designers were no longer thinking about replacing workers but about leveraging the computer’s processing power to produce business decisions. You could argue that some of this was done in WW2 to break codes, but those machines were purpose-built; what the more modern computers allowed was for programming languages to change the decision task to meet the current need. The one handicap was the speed of the computers of the day: decisions were not generated in real time, and the programming involved was complicated.

With the more powerful and useful computers that came out of the late sixties, computers started to become part of real-time processing. Computing power and storage kept increasing every year, letting businesses look at new ways of saving costs / increasing returns by having the computer streamline processes, going beyond replacing people to actually expanding the business. This was achieved through new interfaces beyond the punch card of old: terminals now allowed direct access to the computer processor, enabling live updates of data (think airline ticket handling). In this period the designer was looking at how new technology tools could enhance, rather than replace, the current process. Even with all these advances, though, the use of the computer still depended on a designer working out the needs of the business and getting them coded. Software solutions were rigid and limited to the design parameters provided.

Even at the turn of the century, faster computers with more data-handling capacity were still paired with limited software designs involving fixed parameters and limited interfaces for data collection.

Where we seem to make the evolutionary leap toward AI is when the processing power and data-handling ability of computers cross a threshold at which they can consume non-human-prepped data beyond just text. Previously, processing power limited what a computer could do in real time; now a computer can process not just plain textual data but also images, sound, etc. and make decisions based on them. It is as if we have released a prisoner from a small cell where the only thing they could see was text and their hearing, touch and other senses were deliberately disabled. We are now in a scenario where computers can be educated to interface with humans on a natural level. All the designer needs to do is define the data streams (based on the context, driving a car for example) and the measures of success, and the AI can then learn to process the data.

Now before I talk about the impact of AI on the BA role, I want to break the role into two:

Role 1 is the BA that looks at business processes.

Role 2 is the BA that looks at interfacing IT with business processes.

Business process BAs (Role 1) are already being heavily replaced by the Product Owner role, so while this flavor of BA will eventually cease to exist, the role has a chance of living on for a period of time as a Product Owner, even with the advent of AI. Eventually, however, even Product Owners will be replaced by more sophisticated AI solutions. The big risk is that businesses become clones of each other: an AI analyzing the marketplace may come up with the same opportunities as another business’s AI, killing the opportunity as both deliver the same solution at the same time, leaving neither with an advantage. This happens with humans today, but less often, because humans cannot deliver ideas at the 24x7x365 speed that AI can. Stock market meltdowns have already been shown to happen when multiple pieces of stock monitoring software trigger sell decisions off the same event; the same issue will arise when AI takes over business Product Ownership.

Role 2 BAs, who focus on requirements for IT design, are most likely to be impacted by AI in the very near term. This role has already seen so much of its work move to UI designers and Data Warehousing specialists (both of whom are also at risk of being replaced by AI) that the amount of work left is limited. With the advent of AI, it is conceivable that this BA could be replaced by an AI solution that interfaces directly with the business to produce either IT solutions or output that can be used to create IT solutions. This has been a dream of many companies for years, with easier-to-use software being the traditional way to limit IT expenses (think how many business users work in Excel without ever talking to a BA). AI will make it a reality for everybody to ditch the IT requirements BA. Instead of having to learn an interface specific to a piece of software, the business will get a sophisticated human interface that replicates what the BA does today. For the business it will be as if they are working with a BA, but without the human cost. Certainly it will take time and money to develop this BA-replacement AI, but once it is developed it can easily be distributed and shared.

In summary, AI will be a great move forward for business but will negatively impact the Business Analyst market as we know it today. Business Analysts who focus only on IT requirements would be well advised to move into Product Owner roles or to get involved with the AI development that is happening, so that they can be experts in the field.
