Part II: In Defense Of Government

We find “The Government” to be a common punching bag – for politicians and ordinary citizens alike.

Our politicians rail against a corrupt and ineffective government, pointing out the ills of big government and how its regulations are stifling our industries.

There are folks who talk about dismantling an institution that has evolved over centuries of human development. It seems everyone has an example of some egregious behavior that they use to justify painting the whole institution as bad and in need of pruning. And the travesty of the situation is that no one is standing on the other side defending this vital institution and its usefulness.

So let me take up the defense of the institution called “The Government”, which allows us to cooperate in very large numbers, prescribes and maintains the rule of law, and undertakes projects at a scale that is impossible or unsustainable for an individual, a family, or a small group.

Governments came about when humans started gathering into communities. There were three primary reasons to create a government –

  • Establishing a common benchmark of behavior in society and enforcing conformance to it
  • Achieving scale where an individual, a family, or a small group could not
  • Making outsize bets to push expertise in a domain (agriculture, industry, or technology, e.g. core science) where short- or medium-term benefits may not justify a rational private investment

Establishing the Rule of Law:

Humans felt the need to create rules of common behavior for all members of a community to adhere to. Anyone not adhering to these rules was punished or incarcerated. The institution developed as a system of checks and balances to maintain civil behavior.

Bringing the Benefits of Scale:

If you look historically, humans formed collectives such as villages, towns, and cities to take advantage of the power of the collective. Some projects and endeavors were beyond the scope of an individual or a small group like a family to accomplish. Hence we humans invented the construct of “The Government” to allow us to cooperate in larger numbers: for example, maintaining a military for offense or defense, or building roads that are more than point-to-point connections and are useful for the entire community.

Before we assign all the blame to the government and dismantle it, we should also be willing to give up all the gains achieved by this institution.

Are we ready to give up our military, our highways, the internet, the GPS system, and the antibiotics and miracle drugs we have on the market? All of these are innovations that started as government projects and were then handed over to private enterprise.

 

I do not discount the criticism leveled by some that there are government agents who take advantage of their power, or even some individuals who are free riders on the rest of society. Yes, you can find bad apples in any group, but let’s not use these examples to discredit the institution and forgo the benefits of having a functioning and effective government.

 

Org Design: Governance Vs. Delivery

Benefits of keeping governance and delivery functions together

Over the weekend, one of my ex-colleagues reached out to me seeking advice on an org design question – should you keep governance and best-practices functions within a delivery organization?

My take: you can either keep governance functions within your best delivery unit, or create a separate governance organization; but the latter won’t be successful without assigning it some critical delivery responsibilities.

 

Here’s an example of what has worked in my professional experience:

At Medco, while running the BPM COE, I was tasked with creating a structure that could parallelize development. We had a massive transformation project, with a scale-up target of almost 1,000 developers at peak.

We had applications for various products, like mail order dispensing, point-of-sale adjudication, specialty pharmacy, etc. These applications included common workflow capabilities like order processing, customer advocacy, and therapeutic resource centers, which we chose to implement as frameworks. Multiple scrum teams worked on parallel development. We created a governance group, the Corporate Agile COE, that was responsible for orchestrating application delivery across these framework dev groups (COEs) and the application workstation groups (BIACs). In addition to governance, this group also had some critical enterprise framework delivery responsibilities like authentication and authorization, PHI data access controls, single sign-on, a personalization framework, and a service bus client-server framework.

While governance is a full-time job, without delivery responsibilities it does not carry enough credibility to be effective.

Why?

Architecture and governance should not become an “Ivory Tower”: they need to be grounded, practical, and implementable. When incentives are aligned between delivery and governance, governance principles stay light and implementable rather than onerous. And the best way to prove that a design is implementable is to give the person or team proposing it a chance to implement it under the same guidance. The aim for both architecture and governance should be to be simple, rational, and elegant, and not so onerous as to impact delivery.

Eat your own dog food, and establish credibility when prescribing your solution: the group prescribing architecture principles can demonstrate through its own delivery that those principles work, which earns it credibility when it prescribes solutions for others.

Architecture is not a scapegoat for delivery: other delivery groups cannot claim that the architecture is unimplementable and thus make the architecture and governance group a scapegoat for failed deliveries.

 

 

 

Virtualization – A Necessary Strategy For Any IT Exec

Enabling technologies (inventions) have brought about faster innovation – now that change is constant, we are all trying to outdo the last incremental change. IT execs have to worry about speed to market, which has shrunk from months to days and even minutes.

Remember the days when a project had to schedule hardware and software changes – and if you missed that on your project plan, you were either running on borrowed capacity or running extremely crippled until hardware was ordered, arrived, and was provisioned in the data center. Those days are gone; today, capacity on demand is the norm and “one-click provisioning” is taken for granted.

This is the case at the large investment bank where I work, where we have our own flavor of cloud and on-demand provisioning; it is also true for smaller enterprises, which can use infrastructure cloud providers (AWS, Azure, DigitalOcean, Google, etc.) to spin up containers and add capacity on demand.

Very recently, while mentoring a nonprofit, I came across a situation where the founder of a dance studio was forced to work on her IT systems more than on the organization’s mission. Upon probing deeper, we discovered that while IT tools are excellent productivity drivers, a fragmented landscape of solutions is actually a bigger headache to manage than doing the work manually with pen and paper. The problem was fragmented providers, and merging and managing data across them demanded more IT savvy than the owner and founder of the nonprofit could spare.

We recommended a virtualized and consolidated software solution, which was accepted enthusiastically. We ended up setting up a WordPress instance for her in a matter of hours. In fact, my 7th-grade son pitched in and set up the entire static site for her. Then one of our other volunteers added various plugins, like Mailchimp, class scheduling, and campaign management, to the solution.

This is true democratization of tech – it’s not just the monopoly of large orgs with an army of IT folks; anyone can experience this new paradigm and turbocharge their nonprofit or business.

Here’s a graphic that depicts the types of offerings in the market. The boxes in blue represent what is currently available.

Part I: In Defense Of Globalization

 

The world has changed drastically since the end of World War II. It has become much smaller and more connected. Advancements in science and technology have made it possible for communication, trade, and commerce to occur almost instantaneously. Trade barriers have been broken, international borders have been blurred, and the transfer of goods, people, and ideas happens daily across borders. This brave new world that emerged after the war was brought about by a process called globalization. Globalization, according to Wikipedia, “is the action or procedure of international integration arising from the interchange of world views, products, ideas, and other aspects of culture.” Globalization has changed the world for the better, and it should become the standard for humanity. Its benefits include, but are not limited to, per capita income and revenue growth, the spread of democracy through the developing world, and a general interconnectedness of the human species brought about by social, technological, and ideological exchanges across the world.

Since the 1990s, trade agreements like NAFTA have made it easier for companies to move across borders and maximize revenues. Companies that were once limited to operations in very few countries were able to create jobs where labor was cheapest, resulting in more employment and a rise in the standard of living in poorer countries, like China. China’s economy is a prime example of the economic benefits that globalization has produced. China has become a hub for manufacturing, with companies like Apple producing most of their products there. Since 1990, GDP per capita has increased exponentially, going from USD 317.885 in 1990 to USD 8,027.684 in 2015. Higher individual wages also correlated with a higher total GDP: China’s GDP went from USD 360.859 billion in 1990 to USD 11.008 trillion in 2015. This huge increase in economic value is thanks to the countless companies that have taken advantage of China’s booming population and workforce. This in turn raised incomes for the majority of the Chinese populace. Standards of living have been going up, and outbound tourism from China has been steadily increasing, from 4 million people in 1999 to 50 million people in 2015. China is just one example of the profound effects of globalization on a population. As the winds of globalization inevitably sweep across the world, many developing countries will turn into booming, prosperous nations. According to the Peterson Institute for International Economics, “A sophisticated model predicts that global free trade, removing all post-Uruguay Round barriers, would lift world income by $1.9 trillion ($375 billion for Japan, $512 billion for EU and EFTA, $537 billion for the US, $371 billion for developing countries, $62 billion for Canada, Australia, New Zealand).” This “global free trade”, brought about by trade deals like NAFTA and TPP, will enable labor to spread around the world and increase incomes across the board for citizens of the globe.
However, globalization has come under fire from many people, including US President Donald Trump. His entire campaign was directed against globalization, claiming it took away jobs from “ordinary, hardworking Americans”. It is true that free trade drives companies to shuffle jobs around different countries, potentially leaving some people behind. The key to solving this issue is adaptability. Globalization hastens the pace of innovation across industries, thanks in no small part to the input of ideas from around the globe. New innovations can create new avenues for moneymaking, thus creating new jobs in the places that lost them. To maximize global revenue, trade deals between nations should make it easier for companies to move across borders, all while keeping a balance of power between people, corporations, and governments. In fact, trade deals like these should include councils of representatives from corporations, governments, and unions, enabling openness and transparency in all facets of the economy. If such a system is implemented on a global scale, the prosperity of the human race will soar.

Why Is Innovation Necessary? A Model Based Approach To Drive Growth…

I came across an interesting economic growth model while auditing Scott Page’s “Model Thinking” class on Coursera. Even though the models he discusses are about economic growth for countries and why certain countries grow while others stagnate, I felt they are very relevant to the investment decisions that technology leaders make every day.

Here’s a summary of the basic growth model Scott discusses. Even though it’s a very simple model, it brings out the core concepts you have to weigh when allocating your project investments between RTB (Run the Bank) and CTB (Change the Bank).

The model is set up on an island that has workers and coconuts. You can build picking machines out of coconuts (investment), which increases your output. Machines are lost at the rate of depreciation. Formalizing –

Lt = Workers at time t

Mt = Machines at time t

Ot = Output in coconuts at time t

Et = Coconuts consumed at time t

It = Coconuts invested at time t

s = savings rate

d = depreciation rate

Assumption 1: Output is increasing and concave in labor and machines (capital)

Ot = √Lt * √Mt

Generalizing then –

Ot = Lt^(1-β) * Mt^β

Assumption 2: Output is either consumed or invested

Ot = Et + It

Assumption 3: Machines can be built to increase output (using a concave function) but depreciate

Mt+1 = Mt + It – dMt

Assumption 4: Investment for the next period is output times the savings rate

  It+1 = Ot * s

Since output is a concave function, investment has diminishing returns on output.
To achieve a long-term equilibrium, investment would need to equal depreciation, i.e.

Investment (Ot * s) = depreciation (Mt* d)

Since depreciation is linear and output in our model is concave, you would expect to see the following mapping of these effects:

So at a certain stage, adding more machines does not produce enough output to sustain the loss from depreciation, and growth stagnates. The irony of this growth model is that growth stalls or stops as the effect of depreciation becomes greater than the output produced through additional investment in new machines.
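The stagnation dynamic above can be seen in a minimal simulation sketch. The parameter values here (L, s, d, the starting machine count) are my own illustrative choices, not from the lecture:

```python
# Minimal sketch of the basic island growth model described above.
# Parameter values are illustrative, not from the source model.
def simulate(periods=50, L=100, M=1.0, s=0.3, d=0.1):
    """Iterate O_t = sqrt(L) * sqrt(M_t) with M_{t+1} = M_t + s*O_t - d*M_t."""
    outputs = []
    for _ in range(periods):
        O = (L ** 0.5) * (M ** 0.5)   # Assumption 1: concave output
        outputs.append(O)
        M = M + s * O - d * M          # invest s*O, lose d*M to depreciation
    return outputs

out = simulate()
early_gain = out[1] - out[0]    # large: machines are scarce, each one helps a lot
late_gain = out[-1] - out[-2]   # tiny: depreciation nearly cancels new investment
```

With these numbers the model climbs toward the equilibrium where s * Ot = d * Mt (here M = 900 machines, O = 300 coconuts), and the per-period output gains shrink toward zero on the way there.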

Next, let’s consider Solow’s growth model, which allows us to introduce innovation into the mix and addresses how to overcome this growth stagnation.

Ot = At * Lt^(1-β) * Kt^β

where

Ot = Output at time t

At = Innovation constant at time t

Lt = Labor at time t

Kt = Capital invested at time t

Here the key investment choice a senior stakeholder can make is how to divide the investment between automation (driving up Kt) and game-changing innovation (driving up At); in my experience, you are headed for stagnation or failure if you do not carve out a portion of your budget for innovation rather than just automation!
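To see why even a small, steady push on At matters, here is a minimal simulation sketch. The growth rate g is a hypothetical knob standing in for innovation spend, and the values of s and d are my own illustrative choices:

```python
# Sketch of Solow-style growth: same accumulation dynamic as the island model,
# but output is multiplied by A_t, which grows when innovation is funded.
# The growth rate g is a hypothetical stand-in for innovation/CTB spend.
def solow(periods=50, L=100, K=1.0, A=1.0, s=0.3, d=0.1, g=0.0):
    outputs = []
    for _ in range(periods):
        O = A * (L ** 0.5) * (K ** 0.5)   # O_t = A_t * L_t^(1/2) * K_t^(1/2)
        outputs.append(O)
        K = K + s * O - d * K             # capital accumulates, minus depreciation
        A *= 1 + g                        # innovation lifts A_t each period
    return outputs

automation_only = solow(g=0.0)    # all budget into K_t: growth stalls
with_innovation = solow(g=0.02)   # modest steady lift of A_t: growth continues
```

Without innovation, output flattens below its equilibrium ceiling; with even a 2% per-period lift of At, the ceiling itself keeps moving up and output keeps growing.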

Here Kt would represent your RTB budget, while a portion of CTB (the elements dedicated to innovation) would essentially push the innovation constant higher. The key question you have to grapple with is the distribution between the two, and every organization has its own appetite for dividing funds between these two types of funding. A few key factors driving such decisions are –

– Organization culture

– Level of competition within the industry

– Risk appetite for the senior executive team

– Whether you are listed on a public exchange or privately held

Thoughts? Would love to hear your experience in budget allocations and the kinds of biases or preferences you have experienced. Please comment below or send me a direct email.

Responsibilities of a Product Owner

It is increasingly apparent that technology has become a large driver of business strategy. By providing new channels and mechanisms to interact with customers, clients, partners, and suppliers, technology is a key part of an organization’s sustainable competitive advantage. So it’s imperative for a product owner not just to understand the business landscape but also to understand how he or she can use technology components as key artifacts in differentiating their offering.

Functional Responsibilities:

There have been a number of blogs focused on making requirements intuitive, usable, simple, and efficient, so I will not cover that aspect here.

Non Functional Responsibilities:

Security:

I cannot say enough about making sure your product is safe and secure for your users and their data. This focus is not only about complying with regulations like HIPAA but about staying off the front page of the Wall Street Journal. Given the high-profile breaches we have seen, it is imperative that your product has a plan for the following:

  • Encrypted data stores
  • Multi-factor user authentication
  • Allowing only encrypted access to your app or website
  • Standard DMZ-based protection for your applications
  • A regular security patch management program, software upgrades, and periodic vulnerability assessments, including pen tests and adherence to the OWASP Top Ten list
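As one concrete slice of the authentication point above: credentials should never be stored in the clear. Here is a minimal sketch using only Python’s standard library; the iteration count is illustrative, and production parameters should follow current guidance (e.g. OWASP):

```python
# Never store raw passwords: derive a salted hash and compare in constant time.
# Iteration count is illustrative; follow current OWASP guidance in production.
import hashlib
import hmac
import secrets

def hash_password(password: str, iterations: int = 200_000) -> tuple[bytes, bytes]:
    """Return (salt, key) derived with PBKDF2-HMAC-SHA256 and a random salt."""
    salt = secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, key

def verify_password(password: str, salt: bytes, key: bytes,
                    iterations: int = 200_000) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, key)  # constant-time comparison

salt, key = hash_password("s3cret")
```

The same pattern (random salt, slow key derivation, constant-time comparison) applies regardless of language or framework.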

Business Environment and Model Resiliency:

As a product owner you should keep an eye on the business environment and your competition, so that your business or application is not upended by a competitor introducing a game changing innovation or the market changing to be unfavorable to your product.

When I was working for a mail order pharmacy, we were able to use our functional components to rearrange the value chain and disrupt the pharmacy fulfillment market. Mail order pharmacy is a volume business, and we used benefit plan design, formulary design, and volume discounts to drive our profitability. The model did not involve engagement with a patient except when doing a coverage review or a generic substitution. We had implemented digitization of the prescription (i.e., separating the cognitive and fulfillment parts of the process) so that we could route each prescription to the most efficient pharmacist and/or robotic pharmacy to fill. Our competition, retail pharmacies, were on the other hand local, and a pharmacist at a retail pharmacy would have a one-on-one relationship with the patient. When we introduced the TRC (Therapeutic Resource Center) model, we were able to use our routing components to our advantage. We created centers focused on specific disease conditions (oncology, cardiovascular, diabetes, neuro-psych, specialty, etc.) and were able to route our chronic and complex patients’ calls and fills to these centers. The engagement we got with these patients was much better than at retail pharmacies, and we were able to drive mail order penetration using our plan designs by proving to our clients that we could bend the healthcare cost curve.

So when you are designing your product, keep your design flexible, so you can adapt if the environment or your competitor’s strategy changes.

Configurability:

Ask for your product to be configurable rather than needing a code release every time a change is needed. An example is the regulatory framework we designed for our onboarding platform. Since my investment bank operates in multiple jurisdictions and there are frequent changes in the regulations of any single jurisdiction, it was very beneficial for us to use a regulatory program framework with the rules extracted out of the application. These rules are user-modifiable with the right approval workflow. This has given us a huge advantage in being nimble with regulatory changes, without having to wait for IT release cycles and change windows.
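A minimal sketch of what “rules extracted out of the application” can look like: the rules live in data rather than code, so changing them needs an approval workflow, not a release. The jurisdictions, field names, and thresholds below are hypothetical, not any bank’s actual rules:

```python
# Rules as data: a generic engine evaluates whatever rules are currently loaded.
# In practice the rules would live in a DB or config store behind an approval
# workflow; all names and thresholds here are hypothetical.
RULES = {
    "US": [{"field": "net_worth", "op": ">=", "value": 1_000_000,
            "require": "accredited_check"}],
    "EU": [{"field": "risk_score", "op": "<=", "value": 40,
            "require": "mifid_profile"}],
}

OPS = {">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b}

def required_checks(jurisdiction: str, client: dict) -> list[str]:
    """Return the onboarding checks triggered for this client by current rules."""
    checks = []
    for rule in RULES.get(jurisdiction, []):
        if OPS[rule["op"]](client[rule["field"]], rule["value"]):
            checks.append(rule["require"])
    return checks

print(required_checks("US", {"net_worth": 2_000_000}))  # ['accredited_check']
```

When a regulator changes a threshold, only the `RULES` data changes; the engine and the rest of the application are untouched.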

Modularize and Encourage Plug And Play Components:

When you modularize your system into Lego-like blocks representing various functions, you gain the ability to run a champion-challenger competition between those blocks. This drives product innovation through constant improvement, because the modules compete with one another.

There are new capabilities being added every day in ML (machine learning) and AI (artificial intelligence). You want to have a flexible architecture so that you can plug these modules in anytime to improve the learning and adaptability of your apps.
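The champion-challenger idea above can be sketched very simply: as long as modules share one interface, a candidate block can be evaluated against the incumbent and promoted without touching the rest of the system. The scoring functions and data here are toy stand-ins:

```python
# Champion-challenger behind a shared interface; scorers and data are toys.
from typing import Callable

# Any module matching this signature can be plugged in.
Scorer = Callable[[dict], float]

def champion(order: dict) -> float:
    return 0.5  # incumbent production heuristic (toy)

def challenger(order: dict) -> float:
    return min(1.0, order.get("history", 0) / 10)  # candidate module (toy)

def evaluate(scorer: Scorer, orders: list[dict], labels: list[float]) -> float:
    """Mean absolute error of a pluggable scorer; lower is better."""
    return sum(abs(scorer(o) - y) for o, y in zip(orders, labels)) / len(orders)

orders = [{"history": h} for h in (2, 8, 10)]
labels = [0.2, 0.8, 1.0]
# Promote whichever module scores better; the rest of the system is unchanged.
best = min((champion, challenger), key=lambda s: evaluate(s, orders, labels))
```

A new ML-based module could enter the same competition tomorrow, as long as it honors the interface.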

Data Processing Variety:

Always design your application to handle a multitude of data streams with different variety, velocity, and volume characteristics. What I mean is that your functional modules should be able to handle both structured data (databases, partner systems, etc.) and unstructured data (say, social media feeds or blogs). Given the growth in IoT devices, make sure you can source data from these devices and crunch it in real time, giving your users much more agile, real-time responses.

Portability:

Be cloud ready – i.e., build a portable application that can shift its compute and storage resources at any time. This is key for scaling up, and from a business continuity planning perspective. Say you lose the data center that hosts your application: are you able to port it seamlessly to your disaster recovery compute and storage resources?

Platform Agnostic:

Technology obsolescence is a huge risk. Design your systems using standard patterns, interfaces, and open source components. If one component reaches the end of its useful life or technical support, it can then be replaced by an equivalent functional component that isn’t obsolete. This is not a question of “if” but “when”.

Scalability:

Your product needs to be able to scale up if there is a sudden rush of customer interest. It is useful to have a capacity-on-demand model, where you can dial capacity up or down based upon market conditions. A number of cloud hosting providers have elastic compute offerings that you can leverage. For example, this blog site is hosted on an elastic offering with the ability to scale up or down depending upon demand.

 

SDLC Responsibilities:

Honest Broker for all Stakeholders:

A product owner is usually responsible for prioritizing the requirements from various business units and stakeholders into a single product. He or she must be an honest broker, managing the interests and priorities of these stakeholders in a fair and transparent manner.

Promote Agile Development Practices:

Adopting an agile development methodology allows for iterative delivery, where the development team has the benefit of frequent and constructive feedback and is able to adapt to a dynamic market with rapidly changing demands.

Define Success and Measure Achievement:

Be able to define the metrics measuring application functionality, scalability, and delivery before the commencement of a scrum sprint. The product owner should also be able to measure and communicate progress against these goals, and put plans in place to address any misses.

IT Industry Evolution

Some History

As humans went from hunter-gatherers to agriculture, they settled down in small villages and hamlets. Once agriculture took care of food production, people could focus on specializing their skills, like being a shoemaker, a barber, or an ironsmith. They then bartered their services for food and products they did not themselves produce. Next came the advent of money, which brought efficiency to transactions over bartering. After that came the industrial revolution, with a large-scale focus on mechanization. This was the era when machines powered actions that a human or a group of humans could not perform, improving the efficiency of a single worker in a job shop. This human history is very well documented in Sapiens by Yuval Noah Harari.

The next level of efficiency was achieved by moving from “a job shop” production unit to an assembly line. Henry Ford is credited with introducing this innovation which brought about standardization and higher output. As production moved to assembly lines, there was the need to measure and standardize each action station to improve quality and throughput.

As industry specialized and process engineering became a science, data collection became important for fine-tuning processes, whether to increase production or to reduce a firm’s cash cycle. Hence we saw industries investing in computing technology and resources. Now the trend is not just to eke out efficiencies, but to collect data from the environment or market to shape strategies that take advantage of changes in consumer preferences, and to some extent even to shape or mold those preferences.

How has the IT Industry Evolved?

Trends in Computing

Mainframe and Monolithic Architectures:

The first computers were large-scale machines with very limited computing power. They were invented mainly as academic projects and later found application in data crunching within industry. Initially only the largest enterprises could afford to buy and run them. I am sure you remember the punch card drives and the huge cooling towers around the mainframes!

Evolution of Distributed Systems & Client Server architecture:

Since the ’80s we have seen an explosion in the speed at which information can be collected, processed, and disseminated. Some advances came from the mainstreaming of IT into every business and organization. Initially the target was automation: simple enterprise systems for production, planning, accounting, sales, etc. There were a number of flavors of distributed platforms, each appealing to a different set of users – the Windows platform, very popular in the business computing segment; the Mac platform, which appealed to individual users with creative art applications; and the Unix/Linux platforms, which appealed to the geeks. Eventually, as these platforms competed, the Linux/Java stack started to dominate back-end processing at most business enterprises, while the front end remained Windows-based and Apple made a big dent in the personal computing segment.

This led to centralized databases, which facilitated the collection and analysis of historical data to make processes more efficient. A number of database structures evolved as a result, from hierarchical and network databases to relational and object databases.

Web Infrastructure and Interconnection of computers

In the ’90s came the development of the World Wide Web, with computers forming a connected web using standardized communication protocols like TCP/IP, HTTP, and SMTP. This was a huge improvement over the unconnected islands that businesses and users had maintained before. It really improved the velocity of information: instead of copying data to floppy disks and carrying them from computer to computer, information could be transmitted directly from one computer to another, once every node became addressable and ready to communicate over standard protocols.

Development of Cloud Infrastructure:

In the 2000s the trend moved to virtualization and the ability to run multiple processing slices on any physical machine. Computing became fungible and transferable. The idea was that if everyone ran their own under-utilized physical servers, it would be better to have highly fungible compute and storage slices that could move virtually to the least busy node, improving the efficiency of our computing and storage hardware many times over.

IoT (Internet of Things)

We have also seen compute hardware shrink to such an extent that it can be embedded in any device or appliance. A washing machine or a refrigerator may now have as much or more computing power than a specialized computer from a few years back, which implies that each of these devices can produce process data that can be collected and analyzed to measure efficiency, proactively predict failures, or predict trends.

Decentralized Value Exchange:

A parallel development has been the paradigm shift in creating the ability to transfer value instead of just information. This came about from a seminal paper by Satoshi Nakamoto and the origins of blockchain (more on this in another post).

Some call it a fundamental shift in philosophy: we no longer need to depend on a central store of value (or authority) to establish the truth. It tackles the double-spend problem in a unique and novel way without relying on a central agent. This has spawned applications in various spheres, like digital currency, money transfer, and smart contracts, that will definitely change the way we do business.

Big Data Computing

Today the amount of data has become so overwhelming that traditional centralized database architectures cannot keep up. This has resulted in a number of new architectures like Hadoop, with its MapReduce algorithms and a family of peripheral components and applications:

  • HDFS (Hadoop Distributed File System)
  • Hive (interpreter that turns SQL into MapReduce code)
  • Pig (scripting language that gets turned into MapReduce jobs)
  • Impala (SQL queries over data in an HDFS cluster)
  • Sqoop (moves data from traditional relational DBs into an HDFS cluster)
  • Flume (ingests data generated by source or external systems onto the HDFS cluster)
  • HBase (real-time DB built on top of HDFS)
  • Hue (graphical front end to the cluster)
  • Oozie (workflow tool)
  • Mahout (machine learning library)
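The MapReduce idea at the heart of this family can be shown in miniature with a word count, the canonical example; Hadoop runs the same map, shuffle, and reduce phases, only distributed across a cluster:

```python
# Word count as the classic MapReduce illustration, in plain Python.
from collections import defaultdict

def map_phase(line: str):
    """Map: emit a (word, 1) pair for every word in a line."""
    for word in line.split():
        yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle: group all values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values into a single result."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big clusters", "big data"]
pairs = [kv for line in lines for kv in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
# counts == {'big': 3, 'data': 2, 'clusters': 1}
```

Because map and reduce operate on independent keys, each phase can be parallelized across many machines, which is exactly what Hadoop does at scale.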

Given that this explosion of data being produced, collected, processed, and acted upon is beyond human capability, we have had to delegate to machines the work of collecting, processing, and making sense of these trends; hence the renewed focus on machine learning and artificial intelligence.

Data Analytics and Usage of various Modeling Tools

Given the volume, velocity, and variety of data we deal with, it is not humanly possible for individuals or organizations to analyze this steady stream of data and make sense of it. Hence we have started to construct models to analyze it. There are a number of tools available that visualize and provide insights into this data; they inherently use a best-fit model that fits the existing data and provides predictive value when extrapolating the causative variables.

This has become so ubiquitous in our lives that everything from credit scores to teacher ratings to how an employee is rated at work all depend on models. One of the critical insights in all this is that our inherent biases get encoded into these models. Hence we need to be careful to not trust these models without establishing their fairness. The best defense that we can employ is transparency and negative feedback control loops that help correct these models for accuracy. Cathy O’Neil has analyzed this very phenomenon in her book Weapons of Math Destruction – a delightful read!

Artificial Intelligence and Machine Learning:

Artificial intelligence is intelligence exhibited by machines. AI uses various methods (statistics, computational intelligence, machine learning, or traditional symbolic AI) to attack the following categories of problems, in pursuit of goals like social intelligence, creativity, and general intelligence:

 

  • Reasoning
  • Knowledge Representation
  • Planning
  • Learning
  • Natural Language Processing
  • Perception
  • Robotics

There have been a number of efforts toward autonomous machine behavior, e.g. self-driving cars, which use a variety of sensors like cameras and radars to collect information about the road and other vehicles and make real-time decisions about controlling the car.

 

Machine learning problems can be broadly divided into supervised learning (where we have a body of labeled data that a computer can use to learn and mimic), unsupervised learning (where the computer tries to find structure in a sea of unlabeled data, e.g. through clustering), and reinforcement learning (where a program interacts with a dynamic environment to pursue a certain goal without an explicit teacher). A general categorization of machine learning tasks is as follows: classification, regression, clustering, density estimation, and dimensionality reduction.
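As a toy instance of the supervised category: a one-nearest-neighbor classifier that “learns and mimics” a labeled body of data. The training data here is made up for illustration:

```python
# Minimal supervised learning: 1-nearest-neighbor classification (toy data).
def predict(train: list[tuple[list[float], str]], point: list[float]) -> str:
    """Label a point with the label of its closest training example."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(train, key=lambda ex: dist(ex[0], point))
    return label

train = [([1.0, 1.0], "cat"), ([1.2, 0.9], "cat"), ([5.0, 5.0], "dog")]
print(predict(train, [1.1, 1.0]))  # cat
print(predict(train, [4.8, 5.2]))  # dog
```

The “training” is simply memorizing labeled examples; production methods (regression, neural networks, etc.) generalize far better, but the supervised setup is the same: labeled data in, predictions out.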

Is the ultimate design the creation of a self-aware machine that can compete with humans for survival? Will this be a symbiotic relationship or a competition for survival? Are we simply a tool in the evolutionary process, playing our part in creating a smarter, better and more resilient new being?

Fake News and Information Velocity

Trends in Information Dissemination: the speed of information transmission is increasing at a very rapid pace, but veracity (the source, the "truthiness" of the information (thank you, Stephen Colbert!), and whether it has been tampered with en route) has not kept up.

Initially, information travelled only as fast as humans could travel: a courier would take a document or a piece of information and use a fast horse or chariot to convey it from one location to another. Then humans invented other modes of signal transmission – carrier pigeons, smoke signals, sound/light transmissions within line of sight – and then we moved on to using electric current (telegraph, telephone) and eventually light, radio, microwave, VHF and UHF transmissions. We also learnt how to use satellites for communication when the curvature of the earth's surface got in the way of line-of-sight transmission. Mostly this was point to point, with respected agents in the middle – newspapers, radio stations, TV stations etc. – establishing the veracity of the information.

Meanwhile, in the early part of this century, we had the advent of a number of social networks – Facebook, LinkedIn, Twitter, Snapchat, YouTube, MySpace etc. These networks connected individuals and facilitated extreme agility in information transmission. Think of how easy it is to share an update or a personal opinion on these networks. It has developed to such an extent that it is possible to rapidly disseminate false news with real consequences.

Now it has become possible for anyone to act as an agent and share any content, however authentic or inauthentic, with their network. In a highly interconnected social network, this means veracity is sacrificed.

Take, for example, this specific site – focusnews.us – which very often shows up in my Facebook feed.

This site usually runs incendiary stories with provocative headlines – commentary of someone's personal opinion disguised as news, and sensational clickbait. (I have included only news items that showed up in my Facebook feed today because someone in my network liked, shared or commented on them; the screenshots are omitted here.)

Having a few such catalysts in any social network means a news story can be disseminated very easily, without the veracity requirements of a traditional news organization.

The combined effect of rapid information dissemination and network amplification helps false news propagate and have a much more profound effect on public perception.
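As a back-of-the-envelope model of that amplification, assume each person who sees a story passes it on to r others on average; this is a gross simplification with made-up numbers, but it shows why even modest sharing compounds:

```python
# Toy branching model of network amplification: total people reached after
# a number of sharing generations, if each person reaches r others on average.
def total_reach(r, generations):
    return sum(r ** g for g in range(generations + 1))

compounded = total_reach(3, 10)  # small branching factor, many generations
linear = total_reach(1, 10)      # point-to-point spread stays linear
```

With r = 3 the story reaches tens of thousands of people within ten hops, while point-to-point transmission (r = 1) reaches only eleven.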

I looked at their website to see whether they had any journalistic credentials. I found none: no names for the management team, the editor or any journalistic staff. The Contact Us section provided an anonymous email address (world@focusnews.us) that did nothing to reassure me of their credentials. So I assumed this was a clickbait operation, with a goal of generating ad revenue by posting sensational stories on social networks and getting people to click on them. But I haven't seen any ads on their site – so what should I assume? Is this a disinformation operation being fed to a gullible American electorate that loves its conspiracy theories?

When I looked up the internet registration record for this site, it appeared to be registered in Macedonia, yet it regularly publishes unverified US news.

RAW WHOIS DATA

Domain Name:                           FOCUSNEWS.US
Domain ID:                             D55884931-US
Sponsoring Registrar:                  GODADDY.COM, INC.
Sponsoring Registrar IANA ID:          146
Registrar URL (registration services): whois.godaddy.com
Domain Status:                         clientDeleteProhibited
Domain Status:                         clientRenewProhibited
Domain Status:                         clientTransferProhibited
Domain Status:                         clientUpdateProhibited
Variant:                               FOCUSNEWS.US
Registrant ID:                         CR255879236
Registrant Name:                       ivan stankovic
Registrant Address1:                   JNA 18
Registrant City:                       kumanovo
Registrant State/Province:             macedonia
Registrant Postal Code:                1300
Registrant Country:                    MACEDONIA, THE FORMER YUGOSLAV REPUBLIC OF
Registrant Country Code:               MK
Registrant Phone Number:               xxxxxxxxxxxxxxxxxxxx
Registrant Email:                      @gmail.com
Registrant Application Purpose:        P3
Registrant Nexus Category:             C11
Administrative Contact ID:             CR255879238
Administrative Contact Name:           ivan stankovic
Administrative Contact Address1:       JNA 18
Administrative Contact City:           kumanovo
Administrative Contact State/Province: macedonia
Administrative Contact Postal Code:    1300
Administrative Contact Country:        MACEDONIA, THE FORMER YUGOSLAV REPUBLIC OF
Administrative Contact Country Code:   MK
Administrative Contact Phone Number:   xxxxxxxxxxxxxxxxxx
Administrative Contact Email:          @gmail.com
Administrative Application Purpose:    P3
Administrative Nexus Category:         C11
Billing Contact ID:                    CR255879239
Billing Contact Name:                  ivan stankovic
Billing Contact Address1:              JNA 18
Billing Contact City:                  kumanovo
Billing Contact State/Province:        macedonia
Billing Contact Postal Code:           1300
Billing Contact Country:               MACEDONIA, THE FORMER YUGOSLAV REPUBLIC OF
Billing Contact Country Code:          MK
Billing Contact Phone Number:          xxxxxxxxxxxxxx
Billing Contact Email:                 @gmail.com
Billing Application Purpose:           P3
Billing Nexus Category:                C11
Technical Contact ID:                  CR255879237
Technical Contact Name:                ivan stankovic
Technical Contact Address1:            JNA 18
Technical Contact City:                kumanovo
Technical Contact State/Province:      macedonia
Technical Contact Postal Code:         1300
Technical Contact Country:             MACEDONIA, THE FORMER YUGOSLAV REPUBLIC OF
Technical Contact Country Code:        MK
Technical Contact Phone Number:        xxxxxxxxxxxxxxxx
Technical Contact Email:               @gmail.com
Technical Application Purpose:         P3
Technical Nexus Category:              C11
Name Server:                           NS1.FOCUSNEWS.US
Name Server:                           NS2.FOCUSNEWS.US
Created by Registrar:                  GODADDY.COM, INC.
Last Updated by Registrar:             GODADDY.COM, INC.
Domain Registration Date:              Fri Oct 21 22:24:17 GMT 2016
Domain Expiration Date:                Fri Oct 20 23:59:59 GMT 2017
Domain Last Updated Date:              Mon Oct 31 15:37:59 GMT 2016
DNSSEC:                                false

If people are gullible enough to fall for this clickbait and generate traffic for this website, there is nothing wrong with that as such – it lets the operators push advertising based on site traffic, and a couple of kids making money off this is a perfect example of capitalism.

And who knows whether they are also being paid by a foreign government to carry out a disinformation campaign – oh wait, that's not true, right? We just heard from Putin that they would never do so… so obviously this line of reasoning is wrong…

And if my fellow Americans are gullible enough to fall for this clickbait and provide these entrepreneurs with a steady revenue stream – so be it. But let's not call it news anymore…

Do also check out the report on how fake news works from mediamatters.org.

Reading List

This is a book list I have created for my kids – I highly recommend these. Sid has finished them and Shaurya is progressing. I will keep adding more as I come across ones I recommend.

  1. Elon Musk: Tesla, SpaceX, and the Quest for a Fantastic Future – Ashley Vance
  2. Sapiens: A Brief History of Humankind – Yuval Noah Harari
  3. Homo Deus : A Brief History of Tomorrow – Yuval Noah Harari
  4. The Gene : An Intimate History – Siddhartha Mukherjee
  5. The Emperor of All Maladies: A Biography of Cancer – Siddhartha Mukherjee
  6. The Body Builders : Inside the Science of the Engineered Human – Adam Piore
  7. The Wright Brothers – David McCullough
  8. The Road to Character – David Brooks
  9. Hit Makers – Derek Thompson
  10. Brave New World – Aldous Huxley
  11. President Barack Obama – David Blum
  12. The Art of War – Sun Tzu
  13. The Hitchhiker's Guide to the Galaxy – Douglas Adams
  14. A Path Appears – Nicholas D Kristof and Sheryl WuDunn
  15. Steve Jobs – Walter Isaacson
  16. Lord of the Rings Series – J R R Tolkien
  17. Harry Potter Series – J K Rowling
  18. David and Goliath – Malcolm Gladwell
  19. The Audacity of Hope – Barack H Obama
  20. Dreams from my Father – Barack H Obama
  21. The Power of Habit – Charles Duhigg
  22. Weapons of Math Destruction – Cathy O’Neil
  23. 1984 – George Orwell
  24. Animal Farm – George Orwell
  25. Fahrenheit 451 – Ray Bradbury
  26. Between the World and Me – Ta-Nehisi Coates
  27. Underground Railroad – Colson Whitehead
  28. God : A Human History – Reza Aslan

And these are some that I recommend from my professional life –

  1. The Startup of You – Reid Hoffman/ Ben Casnocha
  2. Principles : Life and Work – Ray Dalio
  3. Smarter Faster Better : The Transformative Power of Real Productivity – Charles Duhigg
  4. Influence Without Authority – Allan R. Cohen & David L. Bradford
  5. The Situational Leader – Dr. Paul Hersey
  6. Leadership and the One Minute Manager – Ken Blanchard, Patricia Zigarmi & Drea Zigarmi
  7. The Black Swan – Nassim Nicholas Taleb
  8. Antifragile – Nassim Nicholas Taleb
  9. Skin in the Game – Nassim Nicholas Taleb
  10. The Back of the Napkin – Dan Roam
  11. The Signal and the Noise – Nate Silver
  12. Flash Boys – Michael Lewis
  13. Dark Pools – Scott Patterson
  14. The Goal – Eli Goldratt

Differences in Motivating Your Team – Healthcare vs. Finance

You may need different techniques to motivate your team in the healthcare and finance domains. Here is some experience from the field.

In the healthcare domain, when I was running projects in the field, we were able to connect project outcomes to a patient's health. For example, on a project where we were automating information collection (using a tablet coupled with a workflow tool, FileNet) for a nurse visit for a patient infusion, we could correlate the accuracy of the data collection, and the subsequent real-time interventions we predicted, to a reduction in ADEs (Adverse Drug Events). It was certainly very motivating for the team to link project outcomes to better patient health and the ability to save a life.

On another project, "Therapeutic Resource Centers (TRC)", we organized our pharmacy and call center staff into clusters of cells, each specializing in a certain disease condition such as cardiovascular, diabetes, neuro-psych or specialty. We were able to use a patient's prescription history and medical claims to stratify them to a TRC. Once we achieved this, any inbound contact from the patient, whether a prescription or a phone call, could be routed to the appropriate therapeutic resource center cell. This resulted in a much more meaningful conversation with the patient about their health, including conversations about adherence and gaps in therapy that had a direct impact on patient health. We were then able to use actuarial studies to prove the value of this model, and introduced innovative products in the market to drive mail-order dispensing at a much higher rate within the drug benefit plans we sold to employers and health plans, while promising an overall reduction in healthcare spend and an improvement in patient health.
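The stratification-and-routing step can be pictured with a small sketch. The condition names, cell names and the deliberately naive "most frequent condition" rule are all hypothetical; the real model used prescription history and medical claims:

```python
# Hypothetical sketch: map a patient's claim history to a Therapeutic
# Resource Center (TRC) cell, then route inbound contacts to that cell.
TRC_BY_CONDITION = {
    "diabetes": "TRC-Diabetes",
    "cardiovascular": "TRC-Cardiovascular",
    "neuro-psych": "TRC-NeuroPsych",
}

def stratify(claim_conditions):
    # Naive placeholder rule: pick the condition appearing most often
    # in the patient's claims.
    counts = {}
    for cond in claim_conditions:
        counts[cond] = counts.get(cond, 0) + 1
    top = max(counts, key=counts.get)
    return TRC_BY_CONDITION.get(top, "TRC-Specialty")
```

Once every patient carries a TRC assignment, routing an inbound call or prescription becomes a simple lookup on that assignment.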

In the finance domain it's a lot harder to connect the outcomes of your projects to a fulfilling objective like patient health. Instead, I have used a couple of different techniques to motivate my teams.

The first is close engagement with the business and the regulators, so the team understands the impact of their work. For example, when we ran the risk-ranking project, what motivated the team was working very closely with the business to visualize our regulatory rules and to easily explain why we arrived at a given result. It also helped that my team had come up with a way of visualizing the rules that the business had not experienced before. We were able to provide a complete audit history of how a risk ranking had changed over time, using a bi-temporal model to store our evaluation results and reason codes, along with the changes in client characteristics that led us to evaluate a different risk metric for the client. The business was a lot more comfortable in our annual sign-offs once this visualization of the rules was in place and the audit records were available whenever we needed them for a regulatory audit.
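A toy sketch of the bi-temporal idea: every risk-ranking change is appended with both a business-effective date and a system-recorded date, so we can answer "what did we believe on date X about the client's risk on date Y?". The field names, dates and risk values below are all hypothetical:

```python
# Bi-temporal audit store sketch: records are never updated in place,
# only appended, with two time axes per record.
from dataclasses import dataclass
import datetime as dt

@dataclass
class RiskRecord:
    client_id: str
    risk: str             # e.g. "LOW", "HIGH"
    reason_code: str      # why the evaluation produced this risk
    valid_from: dt.date   # when the rating takes business effect
    recorded_at: dt.date  # when the system learned/stored the fact

history = [
    RiskRecord("C1", "LOW", "R01", dt.date(2017, 1, 1), dt.date(2017, 1, 1)),
    # Late correction: in March we learn the client was HIGH risk from Feb 1.
    RiskRecord("C1", "HIGH", "R07", dt.date(2017, 2, 1), dt.date(2017, 3, 15)),
]

def risk_as_of(client_id, valid_date, known_date):
    """Latest rating effective on valid_date, using only facts recorded by known_date."""
    candidates = [r for r in history
                  if r.client_id == client_id
                  and r.valid_from <= valid_date
                  and r.recorded_at <= known_date]
    return max(candidates, key=lambda r: r.valid_from).risk if candidates else None
```

Queried as of late February, the client still looks LOW risk (the correction was not yet recorded); queried after mid-March, the same question returns HIGH. That is exactly the reconstruction a regulatory audit asks for.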

A second technique I have used is tying the flexibility of our systems to the rapid changes in the regulatory landscape. Once we had a good design in place that extracted our business rules out of the bowels of a program, the rules became a lot more accessible to change outside of the usual code release window. We were able to change rules on a separate change cycle, as opposed to our regular monthly or quarterly code release cycle, and update things like the list of high-risk-jurisdiction countries at a much faster pace (as needed and approved by the business). Being able to tie our design to a reduction in the capital needed for our projects, while avoiding rapid code changes into production for every rule change, definitely helped team morale: the team was not constantly supporting emergency changes driven by a rapidly changing regulatory landscape.
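A minimal sketch of the rules-as-data idea: the rule table lives outside the evaluation code, so it can ship on its own change cycle. All rule names, country codes and point values here are invented for illustration:

```python
# Rules as data: the evaluation engine is generic; only the rule table
# (which could be loaded from a config file) changes between releases.
HIGH_RISK_COUNTRIES = {"XX", "YY"}  # placeholder codes, maintained by the business

RULES = [
    # (rule name, predicate over a client dict, risk points if it fires)
    ("high_risk_jurisdiction", lambda c: c["country"] in HIGH_RISK_COUNTRIES, 50),
    ("cash_intensive",         lambda c: c["cash_intensive"],                 30),
]

def risk_score(client):
    # Evaluate every rule; return the total score and the names of the
    # rules that fired (the reason codes for the audit trail).
    fired = [(name, pts) for name, pred, pts in RULES if pred(client)]
    return sum(pts for _, pts in fired), [name for name, _ in fired]
```

Updating `HIGH_RISK_COUNTRIES` changes the scoring behavior without touching `risk_score`, which is the essence of moving a rule change out of the code release window.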