It is increasingly apparent that technology has become a major driver of business strategy. By providing new channels and mechanisms to interact with customers, clients, partners and suppliers, technology is a key part of an organization's sustainable competitive advantage. So it is imperative for a product owner not just to understand the business landscape, but also to understand how he or she can use technology components as key elements in differentiating their offering.
Functional Responsibilities:
A number of blogs have already focused on making requirements intuitive, usable, simple and efficient, so I will not cover that aspect here.
Non Functional Responsibilities:
Security:
I cannot say enough about making sure your product is safe and secure for your users and their data. This focus is not only about complying with regulations like HIPAA, but also about staying off the front page of the Wall Street Journal. Given the high-profile breaches we have seen, it is imperative that your product has a plan for the following:
- Encrypted data stores (a small sketch of this follows the list)
- Multi-factor user authentication
- Encrypted-only access (e.g. TLS/HTTPS) to your app or website
- Standard DMZ-based protection for your applications
- A regular security patch management program, with software upgrades and periodic vulnerability assessments, including penetration tests and adherence to the OWASP Top Ten list
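To make the first item concrete, here is a minimal sketch of encrypting a record before it lands in the data store, using the open source Python cryptography package. The record shown is a made-up placeholder, and key management (ideally a secrets manager, KMS or HSM) is out of scope here.

```python
from cryptography.fernet import Fernet

# Minimal sketch: encrypt a record before persisting it, decrypt on read.
# In production the key would come from a secrets manager or KMS, never
# from source code or an unencrypted config file.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b'{"patient_id": "12345", "rx": "atorvastatin 20mg"}'   # illustrative data only
ciphertext = cipher.encrypt(record)        # what actually lands in the data store
assert cipher.decrypt(ciphertext) == record
print(ciphertext[:40], b"...")
```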
Business Environment and Model Resiliency:
As a product owner you should keep an eye on the business environment and your competition, so that your business or application is not upended by a competitor introducing a game-changing innovation or by the market shifting in ways unfavorable to your product.
When I was working for a mail order pharmacy, we were able to use our functional components to rearrange the value chain and disrupt the pharmacy fulfillment market. Mail order pharmacy is a volume business, and we used benefit plan design, formulary design and volume discounts to drive our profitability. The model did not involve engaging with a patient unless we were doing a coverage review or a generic substitution. We had digitized the prescription (i.e. separated the cognitive and the fulfillment parts of the process) so that we could route it to the most efficient pharmacist and/or robotic pharmacy to fill. Our competition, retail pharmacies, were local by contrast, and a pharmacist at a retail pharmacy would have a one-on-one relationship with the patient. When we introduced the TRC (Therapeutic Resource Center) model, we were able to use our routing components to our advantage. We created centers focused on specific disease conditions (oncology, cardiovascular, diabetes, neuro-psych, specialty, etc.) and routed our chronic and complex patients' calls and fills to these centers. The engagement we got with these patients was far better than at retail pharmacies, and we were able to drive mail order penetration through our plan designs by proving to our clients that we could bend the healthcare cost curves.
So when you are designing your product, keep your design flexible, so you can adapt if the environment or your competitor’s strategy changes.
Configurability:
Ask for your product to be configurable rather than needing a code release every time something changes. An example is the regulatory framework we designed for our onboarding platform. Since my investment bank operates in multiple jurisdictions, and regulations in any single jurisdiction change frequently, it was very beneficial for us to use a regulatory program framework with the rules extracted out of the application. These rules are user-modifiable with the right approval workflow. This has given us a huge advantage in being nimble with regulatory changes and not having to wait for IT release cycles and change windows.
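As a minimal sketch of the idea, the rules below live in data rather than in code, so an approved configuration change is all that is needed when a regulation changes. The jurisdictions, field names and rule format here are purely illustrative, not the framework we actually built.

```python
import json

# Rules live in data (a JSON document, a database table, etc.), not in code,
# so a change only needs an approved configuration update, not a release.
RULES = json.loads("""
[
  {"jurisdiction": "US", "field": "tax_id",      "required": true},
  {"jurisdiction": "UK", "field": "lei",         "required": true},
  {"jurisdiction": "UK", "field": "mifid_class", "required": true}
]
""")

def missing_fields(client: dict, jurisdiction: str) -> list[str]:
    """Return the onboarding fields the client is still missing for a jurisdiction."""
    return [r["field"] for r in RULES
            if r["jurisdiction"] == jurisdiction
            and r["required"]
            and not client.get(r["field"])]

if __name__ == "__main__":
    client = {"name": "Acme Ltd", "lei": "5493001KJTIIGC8Y1R12"}
    print(missing_fields(client, "UK"))   # ['mifid_class']
```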
Modularize and Encourage Plug And Play Components:
When you modularize your system into Lego-like blocks representing various functions, you gain the ability to run a champion-challenger competition between those blocks. This drives product innovation through constant improvement, because the modules compete with one another.
There are new capabilities being added every day in ML (machine learning) and AI (artificial intelligence). You want to have a flexible architecture so that you can plug these modules in anytime to improve the learning and adaptability of your apps.
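Here is a minimal sketch of what such a plug-and-play boundary with champion-challenger routing could look like. The module names, recommendation outputs and the 10% challenger split are hypothetical placeholders.

```python
import random
from abc import ABC, abstractmethod

# Any block that implements this interface can be plugged in, including a new
# ML-driven challenger that competes with the current champion.

class RecommendationModule(ABC):
    @abstractmethod
    def recommend(self, user_id: str) -> list[str]: ...

class ChampionRules(RecommendationModule):
    def recommend(self, user_id: str) -> list[str]:
        return ["best_seller_1", "best_seller_2"]       # current production logic

class ChallengerML(RecommendationModule):
    def recommend(self, user_id: str) -> list[str]:
        return ["personalized_1", "personalized_2"]     # new ML-driven block

def route(user_id: str, challenger_share: float = 0.10) -> list[str]:
    """Send a small slice of traffic to the challenger so the two can be compared."""
    module = ChallengerML() if random.random() < challenger_share else ChampionRules()
    return module.recommend(user_id)
```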
Data Processing Variety:
Always design your application to handle a multitude of data streams with different variety, velocity and volume characteristics. What I mean is that your functional modules should be able to handle both structured data (databases, partner systems, etc.) and unstructured data (say, social media feeds or blogs). Given the growth in IoT devices, make sure you can source data from these devices and crunch it in real time, so that your responses to users are far more agile and immediate.
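One way to picture this is a single ingestion boundary that normalizes both kinds of input into a common record for downstream modules. This is only a sketch; the field names and sources are illustrative.

```python
import json
from datetime import datetime, timezone

def normalize(raw: str, source: str) -> dict:
    """Turn any incoming payload into a uniform record for downstream modules."""
    record = {"source": source, "received_at": datetime.now(timezone.utc).isoformat()}
    try:
        # Structured input (e.g. an IoT sensor reading or a partner system feed)
        record.update({"kind": "structured", "payload": json.loads(raw)})
    except json.JSONDecodeError:
        # Unstructured input (e.g. a social media post or blog comment)
        record.update({"kind": "unstructured", "text": raw})
    return record

if __name__ == "__main__":
    print(normalize('{"device_id": "therm-42", "temp_c": 21.5}', source="iot"))
    print(normalize("Loving the new app update!", source="social"))
```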
Portability:
Be cloud-ready, i.e. build a portable application that can shift its compute and storage resources at any time. This is key for scaling up, and from a business continuity planning perspective. Say you lose the data center that hosts your application: can you port it seamlessly to the disaster recovery compute and storage resources?
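A common way to keep an application that portable is to externalize every environment-specific binding, so a failover is purely a configuration change. The sketch below assumes hypothetical variable names and endpoints.

```python
import os

# Minimal sketch: no hard-coded endpoints, so the same build can run in the
# primary region or the disaster recovery site just by changing its environment.
DATABASE_URL = os.environ.get("DATABASE_URL", "postgresql://localhost/app")
OBJECT_STORE = os.environ.get("OBJECT_STORE_ENDPOINT", "https://storage.primary.example.com")
REGION       = os.environ.get("REGION", "primary")

def describe_runtime() -> str:
    return f"region={REGION} db={DATABASE_URL} store={OBJECT_STORE}"

if __name__ == "__main__":
    # Failing over is a configuration change only, e.g.:
    #   REGION=dr DATABASE_URL=... OBJECT_STORE_ENDPOINT=... python app.py
    print(describe_runtime())
```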
Platform Agnostic:
Technology obsolescence is a huge risk. Design your systems using standard patterns, interfaces and open source components. If one component reaches the end of its useful life or technical support, it can then be replaced by an equivalent, non-obsolete functional component; this is not a question of "if" but "when".
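As a small illustration, the function below is written against Python's standard DB-API rather than a specific database product, so the underlying driver can be swapped later. The table and query are illustrative, and different drivers do use different connection strings.

```python
import sqlite3   # open source component implementing the standard Python DB-API

def count_open_orders(connect, dsn: str) -> int:
    """Depends only on the DB-API interface, not on any one vendor's client library."""
    conn = connect(dsn)
    try:
        cur = conn.cursor()
        cur.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, status TEXT)")
        cur.execute("SELECT COUNT(*) FROM orders WHERE status = 'open'")
        return cur.fetchone()[0]
    finally:
        conn.close()

if __name__ == "__main__":
    # Pass in a different DB-API driver's connect function when this one is retired.
    print(count_open_orders(sqlite3.connect, ":memory:"))
```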
Scalability:
Your product needs to be able to scale up if there is a sudden rush of customer interest. It is useful to have a capacity-on-demand model, where you can dial capacity up or down based upon market conditions. A number of cloud hosting providers offer an elastic compute model that you can leverage. For example, this blog site is hosted on an elastic offering that can scale up or down depending upon demand.
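Elastic platforms automate the "dial up or down" decision; the sketch below shows the shape of that decision in the simplest possible terms. The CPU thresholds and instance counts are made-up placeholders, and real autoscalers also watch queue depth, latency and other metrics.

```python
def desired_instances(current: int, avg_cpu_pct: float,
                      scale_up_at: float = 70.0, scale_down_at: float = 30.0,
                      minimum: int = 2, maximum: int = 20) -> int:
    """Pick the next capacity level from a single utilization metric."""
    if avg_cpu_pct > scale_up_at:
        return min(current * 2, maximum)      # rush of interest: add capacity fast
    if avg_cpu_pct < scale_down_at:
        return max(current - 1, minimum)      # quiet period: shed cost gradually
    return current

if __name__ == "__main__":
    print(desired_instances(current=4, avg_cpu_pct=85.0))   # -> 8
    print(desired_instances(current=4, avg_cpu_pct=20.0))   # -> 3
```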
SDLC Responsibilities:
Honest Broker for all Stakeholders:
A product owner is usually responsible for prioritizing the requirements from various business units and stakeholders into a single product. He or she must be an honest broker managing the interests and prioritization across these stakeholders in a fair, transparent and honest manner.
Promote Agile Development Practices:
Adopting an agile development methodology allows for iterative delivery, where the development team benefits from frequent, constructive feedback and is able to adapt to a dynamic market with rapidly changing demands.
Define Success and Measure Achievement:
Be able to define the metrics that measure application functionality, scalability and delivery before the scrum sprint commences. You should also be able to measure these metrics, communicate the results, and put plans in place to address any goals that are missed.
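One lightweight way to do this is to write the targets down as data before the sprint and report against them at the end. The metric names and numbers below are illustrative placeholders only.

```python
# Targets agreed before the sprint starts; actuals measured at the end.
TARGETS = {
    "p95_response_ms":   {"target": 300, "higher_is_better": False},
    "stories_delivered": {"target": 8,   "higher_is_better": True},
    "defects_escaped":   {"target": 0,   "higher_is_better": False},
}

def sprint_report(actuals: dict[str, float]) -> list[str]:
    lines = []
    for name, spec in TARGETS.items():
        actual, target = actuals[name], spec["target"]
        met = actual >= target if spec["higher_is_better"] else actual <= target
        lines.append(f"{name}: target={target} actual={actual} {'MET' if met else 'MISSED'}")
    return lines

if __name__ == "__main__":
    print("\n".join(sprint_report(
        {"p95_response_ms": 280, "stories_delivered": 7, "defects_escaped": 0})))
```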


Mainframes:
Centralized machines with enormous processing power. These were mainly invented as an academic project and later found application in data crunching within industry. Initially only the largest enterprises could afford to buy and run them. I am sure you remember the punch card drives and the huge cooling towers around the mainframes!
Distributed Computing:
This wave changed how information can be collected, processed and disseminated. Some advances came from the mainstreaming of IT into every business or organization. Initially the target was automation: simple enterprise systems for production, planning, accounting, sales, etc. There were a number of flavors of distributed platforms, each appealing to a different set of users: the Windows platform, very popular for business user computing; the Mac platform, which appealed to individual users with creative needs; and the Unix/Linux platforms, which appealed to the geeks. Eventually, as these platforms competed, the Linux/Java stack came to dominate back-end processing at most business enterprises, while the front end remained Windows-based and Apple made a big dent in the personal computing segment.
Networking:
Computers were linked into a connected web with standardized communication protocols like TCP/IP, HTTP, SMTP, etc. This was a huge improvement over the unconnected islands that businesses and users had maintained before, and it dramatically improved the velocity of information travel: from copying data onto floppy disks and carrying it from computer to computer, to transmitting information directly from one machine to another once every node became addressable and able to understand standard protocols.
Virtualization:
Compute and storage were carved into slices that could run on any physical machine. Computing became fungible and transferable. The idea was that if everyone was running their own under-utilized physical servers, it would be better to have highly fungible compute and storage slices that could move virtually to the least busy node, improving the efficiency of our computing and storage hardware many times over.
IoT (Internet of Things):
Computing hardware has become so small and cheap that it can be embedded in any device or appliance. A washing machine or a refrigerator may now have as much computing power as a specialized computer from a few years back, which means each of these devices can produce process data that can be collected and analyzed to measure efficiency, proactively predict failures or spot trends.
Blockchain:
Networks can now transfer value instead of just information. This came about from a seminal paper by Satoshi Nakamoto and the origins of blockchain (more on this in another post). The breakthrough is that participants no longer depend on a central store of value (or authority) to establish the truth. Blockchain tackles the double-spend problem in a unique and novel way without relying on this central agent, and it has spawned applications in various spheres, like digital currency, money transfer and smart contracts, that will definitely change the way we do business.
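The sketch below shows only the hash-linked structure that makes the ledger tamper-evident; the full double-spend solution also relies on distributed consensus (proof of work and the like), which is out of scope here, and the transactions shown are made up.

```python
import hashlib, json, time

def make_block(transactions: list, previous_hash: str) -> dict:
    """Each block commits to its contents and to the hash of the previous block."""
    block = {"timestamp": time.time(), "transactions": transactions,
             "previous_hash": previous_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block(["alice pays bob 5"], previous_hash="0" * 64)
second  = make_block(["bob pays carol 2"], previous_hash=genesis["hash"])

# Tampering with an earlier block breaks every later link in the chain.
genesis["transactions"][0] = "alice pays bob 500"
recomputed = hashlib.sha256(json.dumps(
    {k: genesis[k] for k in ("timestamp", "transactions", "previous_hash")},
    sort_keys=True).encode()).hexdigest()
print(recomputed == second["previous_hash"])   # False: the chain exposes the edit
```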
Big Data:
The Hadoop ecosystem grew up around map-reduce algorithms and a family of peripheral components/applications like HDFS (Hadoop Distributed File System), Hive (an interpreter that turns SQL into MR code), Pig (a scripting language that gets turned into MR jobs), Impala (SQL queries over data in an HDFS cluster), Sqoop (moves data from traditional relational DBs into an HDFS cluster), Flume (ingests data generated by source or external systems onto the HDFS cluster), HBase (a real-time DB built on top of HDFS), Hue (a graphical front end to the cluster), Oozie (a workflow tool), Mahout (a machine learning library), etc.
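To show the map-reduce idea itself, here is the classic word-count example in plain Python rather than on a Hadoop cluster: map each record to (key, value) pairs, group by key, then reduce each group.

```python
from itertools import groupby
from operator import itemgetter

def map_phase(line: str):
    """Map: emit a (word, 1) pair for every word in the input line."""
    for word in line.lower().split():
        yield (word, 1)

def reduce_phase(pairs):
    """Shuffle/sort by key, then reduce: sum the counts for each word."""
    pairs = sorted(pairs, key=itemgetter(0))
    for word, group in groupby(pairs, key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

if __name__ == "__main__":
    lines = ["the quick brown fox", "the lazy dog", "the quick dog"]
    mapped = (pair for line in lines for pair in map_phase(line))
    print(dict(reduce_phase(mapped)))   # {'brown': 1, 'dog': 2, 'fox': 1, 'lazy': 1, 'quick': 2, 'the': 3}
```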
Analytics:
With all of this data being collected, we need tools to make sense of it. There are a number of tools available that visualize and provide insights into this data; they typically fit a best-fit model to the existing data and provide predictive value when extrapolating the causative variables.
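A minimal illustration of the "best fit model" idea: fit a line to observed data with NumPy's least-squares polyfit and use it to extrapolate. The demand figures are made-up placeholders.

```python
import numpy as np

months = np.array([1, 2, 3, 4, 5, 6], dtype=float)
demand = np.array([110, 125, 142, 155, 171, 186], dtype=float)   # illustrative observations

slope, intercept = np.polyfit(months, demand, deg=1)   # least-squares best-fit line
forecast_month = 9.0
print(f"forecast for month 9: {slope * forecast_month + intercept:.0f}")
```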
AI (Artificial Intelligence):
These systems tackle a broad range of problem categories using various methods (statistics, computational intelligence, machine learning or traditional symbolic AI) to achieve goals like social intelligence, creativity and general intelligence. Autonomous cars are one example: they use a variety of sensors, like cameras and radars, to collect information about the road and other vehicles and make real-time decisions about controlling the car.
Will these intelligent systems eventually compete with humans for survival? Will it be a symbiotic relationship or a competition for survival? Are we simply a tool in the evolutionary process, playing our part in creating a smarter, better and more resilient new being?
