Saturday, December 31, 2011
Did you lose your fancy job title?
How many architects does it take to change a lightbulb? The answer remains unknowable. What we do know is that many job descriptions suffer from title inflation. There are UI architects, Java architects, enterprise architects and so on. Sadly, many of these people use the word architect in their job title to signal seniority rather than ability.
As an Enterprise Architect myself, I am a student of simplification and a champion of rationalization. In short, I believe that too many people have the title of Architect and that we need initiatives to bring about industry-wide title rationalization. Simplification of job titles is one step that Enterprise Architecture teams should consider as a way of making IT appear less complex to the business. In doing so, the business has a better chance of knowing whom to approach with game-changing ideas instead of getting lost in a quagmire of titles.
While it is noble to do the right thing for the business, we also have to consider the impact of titles on the people who hold them. Usually, job grades are embedded in titles, and promotions make the new job grade public through a new title. A person’s job grade is generally considered public information. If employees are fairly placed in their job grades and promoted only when they are clearly performing at a new job grade, then salary differences based on job grade are generally perceived to be fair.
One downside of title simplification is added confusion in the marketplace, since the range of ability within a single title becomes much broader. More importantly, removing titles from the management toolbox means the focus will need to shift to alternative rewards. Gone are the days when you could get away with softening an inadequate pay raise with a snazzy new title.
For employers that are looking to make things simpler, I hope they will take the prudent course of action and acknowledge that titles are bi-directional entities that provide value for all involved parties.
Sunday, December 25, 2011
Liferay Portal: Thoughts on ServiceBuilder
1. Liferay does a good job in making itself manageable, but could take a further step in leveraging JMX to create additional visibility. Generally speaking, when you create a custom service, you may want to expose metrics such as uptime, number of invocations, etc. It should be relatively straightforward for ServiceBuilder to also create a JMX MBean that provides visibility into service operations.
Whenever a service starts, it could automatically register itself with the MBeanServer. Alternatively, if you want your MBeans to be available as soon as you deploy your application, you could listen for notifications from the deployment service.
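As a sketch of what such generated instrumentation could look like, here is a minimal example using only the standard javax.management API; the class names, the ObjectName domain, and the chosen metrics are illustrative assumptions on my part, not actual ServiceBuilder output:

```java
import java.lang.management.ManagementFactory;
import java.util.concurrent.atomic.AtomicLong;
import javax.management.MBeanServer;
import javax.management.ObjectName;

// By JMX convention, the management interface must be named <Impl>MBean.
interface ServiceStatsMBean {
    long getInvocationCount();
    long getUptimeMillis();
}

public class ServiceStats implements ServiceStatsMBean {
    private final long startedAt = System.currentTimeMillis();
    private final AtomicLong invocations = new AtomicLong();

    public long getInvocationCount() { return invocations.get(); }
    public long getUptimeMillis() { return System.currentTimeMillis() - startedAt; }

    // The generated service layer would call this on each service invocation.
    public void recordInvocation() { invocations.incrementAndGet(); }

    /** Register this bean when the service starts; the domain name is invented. */
    public ObjectName register(String serviceName) throws Exception {
        MBeanServer server = ManagementFactory.getPlatformMBeanServer();
        ObjectName name = new ObjectName(
            "com.example.portlet:type=ServiceStats,name=" + serviceName);
        server.registerMBean(this, name);
        return name;
    }

    public static void main(String[] args) throws Exception {
        ServiceStats stats = new ServiceStats();
        stats.recordInvocation();
        ObjectName name = stats.register("BookService");
        System.out.println(
            ManagementFactory.getPlatformMBeanServer().isRegistered(name));
    }
}
```

Once registered, the metrics show up in any JMX console (jconsole, VisualVM) with no further work, which is exactly the kind of visibility ServiceBuilder could hand developers for free.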
2. Many enterprises want to increase the testability of applications they put into production and will naturally gravitate towards tools such as JUnit. Since Liferay creates services that can be exposed via SOAP and JSON, there is an equal opportunity to automate the creation of unit tests for these invocation methods. Testing the JSON APIs may be best accomplished with a library such as HttpUnit, which pairs naturally with JUnit.
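To make the idea concrete, here is a dependency-free sketch of the scaffolding such a generated test might rest on; the URL layout, the service and method names, and the crude string check are all assumptions for illustration (a real generated suite would drive HttpUnit or a proper JSON parser against a running portal):

```java
public class JsonServiceTestSkeleton {

    /** Build the URL for a JSON web service method; the path layout is an assumption. */
    static String serviceUrl(String host, String serviceName, String methodName) {
        return host + "/api/jsonws/" + serviceName + "/" + methodName;
    }

    /** A generated assertion for a list-returning method: the body should be a JSON array. */
    static boolean looksLikeJsonArray(String responseBody) {
        String t = responseBody.trim();
        return t.startsWith("[") && t.endsWith("]");
    }

    public static void main(String[] args) {
        // A generated test would fetch serviceUrl(...) over HTTP and assert on the body.
        System.out.println(serviceUrl("http://localhost:8080", "book", "get-books"));
        System.out.println(looksLikeJsonArray("[{\"title\":\"EA\"}]"));
    }
}
```

The point is that ServiceBuilder already knows every method signature, so emitting one such test per exposed method is mechanical.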
3. Liferay needs a better way to handle subtypes. A supertype/subtype design requires that some attributes be stored in the supertype, and some be stored in the subtype. The attributes for each thing in the real world are split between two tables/entities. This may be useful in a variety of scenarios where you need to store different typed attributes for a given population.
One example that comes to mind is for storing information about different types of users. Imagine a role-based portal where you have agents, employees and consumers all using the same resource. An agent may have a unique industry-provided identifier such as a professional license. An employee may have an employee ID and so on. A service that understands these relationships would be very powerful...
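A sketch of what such a supertype/subtype arrangement might look like in plain Java; the names and attributes are invented for illustration, and in Liferay the supertype and each subtype would map to separate ServiceBuilder entities:

```java
import java.util.Arrays;
import java.util.List;

// Supertype: attributes common to every portal user live here.
abstract class PortalUser {
    final long userId;
    final String fullName;

    PortalUser(long userId, String fullName) {
        this.userId = userId;
        this.fullName = fullName;
    }

    /** Each subtype supplies its own externally meaningful identifier. */
    abstract String externalId();
}

// Subtype: an agent carries an industry-provided license number.
class Agent extends PortalUser {
    final String licenseNumber;
    Agent(long userId, String fullName, String licenseNumber) {
        super(userId, fullName);
        this.licenseNumber = licenseNumber;
    }
    String externalId() { return "LICENSE:" + licenseNumber; }
}

// Subtype: an employee carries an internal employee ID.
class Employee extends PortalUser {
    final String employeeId;
    Employee(long userId, String fullName, String employeeId) {
        super(userId, fullName);
        this.employeeId = employeeId;
    }
    String externalId() { return "EMP:" + employeeId; }
}

public class SubtypeDemo {
    public static void main(String[] args) {
        List<PortalUser> users = Arrays.asList(
            new Agent(1, "Ann Agent", "CT-4431"),
            new Employee(2, "Ed Employee", "E-1009"));
        // A service that understands the subtype relationships treats both uniformly
        // while still exposing the type-specific attributes.
        for (PortalUser u : users)
            System.out.println(u.fullName + " -> " + u.externalId());
    }
}
```

The common attributes are stored once in the supertype, and each subtype holds only its own, which is precisely the split between tables that the post describes.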
Saturday, December 24, 2011
What exactly is a consultant?
* You work very odd hours.
* You are paid a lot of money to keep your client happy.
* You are paid well but your pimp gets most of the money.
* You spend a majority of your time in a hotel room.
* You charge by the hour but your time can be extended for the right price.
* You are not proud of what you do.
* Creating fantasies for your clients is rewarded.
* It's difficult to have a family.
* You have no job satisfaction.
* If a client beats you up, you get sent to another client.
* You are embarrassed to tell people what you do for a living (people ask you what you do and you can't explain it)
* Your family hardly recognizes you at reunions (at least the reunions you attend).
* Your friends have distanced themselves from you and you're left hanging with only other professionals.
* Your client pays for your hotel room plus your hourly rate.
* Your client always wants to know how much you charge and what they get for the money.
* When you leave to go see a client, you look great, but return looking like hell (compare your appearance on Monday A.M. to Friday P.M.).
* You are rated on your performance in an excruciating ordeal.
* Even though you get paid the big bucks, it's the client who walks away smiling.
* The client always thinks your cut of your billing rate is higher than it actually is, and in turn, expects miracles from you.
* Every day you wake up and tell yourself you're not going to be doing this stuff for the rest of your life.
Can anyone guess what profession I am describing?
Friday, December 23, 2011
Enterprise Architecture: Ways to make healthcare more affordable (Part Two)
Reduce stress in the workplace: How come HR seems to focus only on employee benefits and not on the causes of stress in general? We all know that stress causes cortisol levels in the body to rise, which can affect those with high blood pressure, diabetes and even high cholesterol.
In an informal survey of Enterprise Architects who work for The Hartford, over 60% take some form of medication in the above three categories. Do you think there may be some correlation between their health and the leaders to whom they report?
Another informal survey indicated that project managers are seeing an increase in diabetes. Who else thinks this is highly likely to be correlated with a culture that has morphed away from eating healthy food in the cafeteria and taking group walks at lunch, toward grabbing the quickest thing possible from the vending machine and running to the next meeting?
Provide healthcare at work: How many people have to take time off from work simply to make a doctor's visit? Many people have to take on the stress of juggling their calendars just to get an appointment with a doctor for their stress! Why can't we save doctors for major medical concerns and instead provide a nurse onsite for minor ailments?
Even if you work in a building such as the World Trade Center where there are hundreds of distinct tenants, the value proposition of everyone in the building sharing a nurse or two would pay productivity dividends for both employers and employees alike. Seeing a nurse would be cheaper than a doctor and could be done without the arduous process of making an appointment.
Encourage government to provide incentives for teleworker programs: Ever notice how many people use their cars to commute to work in crowded metros such as New York City, Atlanta and Los Angeles? Many Americans have back problems, so the prudent course of action would be to reduce the time they spend in seats that don't allow an optimal sitting position.
We could of course encourage car manufacturers to put better seats into vehicles but this has a long tail. What if we could instead find ways to avoid sitting in vehicles that don't have lumbar supports whether it is a car, bus or train and instead let people work from home where they could have a better potential for good posture?
Thursday, December 22, 2011
Insurance, Solvency and Enterprise Architecture
Generally speaking, the reinsurance industry introduced the notion of models as a way for private equity and hedge funds to measure the potential risk of loss. The folks who created the Solvency requirements seem to have latched onto the same level of thinking. The challenge as I see it is that carriers are attempting to produce a single model instead of several.
Consider the fact that in facultative reinsurance, the reinsurer gets to see the flow and is afforded easier and more reliable accumulation control, whereas in treaty reinsurance they at best only get the information used to underwrite the portfolio at a macro level. This results in an approach that compares a scenario with high data fidelity to one with lower data fidelity.
Another gap as I see it is that the model doesn't drive underwriting and tends to be an after-the-fact event. Once an underwriter accepts the risk, the reinsurance carrier is committed; only then can they incorporate the data entered for the deal into the model. If the goal is to reduce risk, don't you think this process should be inverted?
I also agree that the models fail to account for the differences on the property side, where there tends to be a more repeatable loss history than on the casualty side.
To argue from the other side of the table, we could also end up with a scenario where the "numbers" and the actuaries over-dominate the human judgment of a skilled underwriter. When we become too numbers-focused, we end up with suboptimal results; look at the state of IT for a prime example. Regardless of where you land philosophically, I have been scratching my head trying to figure out why I have not run across any online discussions where practitioners of enterprise architecture discuss the impact of Solvency on their organizations. I cannot think of anything with more potential to change how the business works. Are all of my insurance EA peers asleep at the wheel?
Monday, December 19, 2011
Social Networking serves to destroy enterprise architecture
Consider the scenario of being an enterprise architect for a social networking site such as Facebook. Is the goal to focus on application rationalization? Is the goal to perform an inventory of all applications in production so that they can be loaded into an application portfolio management tool? I can't think of a better reference model for enterprise architecture and how IT supports the business than Facebook.
Enterprise architecture within many social networking sites is savagely focused on not letting IT become an impediment to business agility. The focus is less about IT strategy in the strict sense and more about getting the fundamentals of IT right. They are constantly finding better ways to design and build software as well as improving how IT infrastructure teams operate.
I find it fascinating how many enterprise architects can't fathom the simplicity of bespoke enterprise architecture! I have observed that social networking sites exhibit the following practices:
- a. Champion modest code bases
- b. Keep architectures extensible instead of designing all possible functionality upfront
- c. Eschew monolithic practices such as writing comprehensive business requirements documents, and instead focus on ways to collect feedback from users of the system
- d. Design the ecosystem for growth and do whatever it takes to mask inefficiencies
- e. Don't establish an organization chart that discourages technical people from remaining technical; in fact, encourage the existence of savage teams of nomad developers and encourage them to travel in tribes
- f. Get all the useless processes that don't help people do their job out of the way
So, if you want to be agile, align with the business or other buzzwords of the day, you need to pay more attention to how it is done in social networking ecosystems...
Friday, December 16, 2011
Enterprise Architecture: Ways to make healthcare more affordable...
Why am I not surprised to note that no one has studied the supply chain of healthcare? Today, I will share a few ideas that each are guaranteed to save billions...
1. Optimize the pharmacy process: Consider the simple fact that many people have diseases, such as diabetes, that are long term in nature. People with diabetes typically take the same drug for pretty much the rest of their lives. When does it ever make sense to get a thirty-day prescription for something you will have for at least the next thirty years? Think about what the overhead of checking in with a pharmacist so frequently costs. Some have extended this to ninety days, but that too misses the point. How much could we save if we could get all our drugs in one shot as an annual event?
2. Change how drugs are dispensed: I frequently purchase prescription medication for family members whenever I travel outside of the United States. My last purchase was in Sangre Grande, Trinidad where I simply handed the pharmacist my money and he handed me a sealed jar. Don't you think this process is much safer than what you witness in your local CVS where the pharmacist is doing extreme and potentially error-prone multitasking in talking to customers, moving pills from one jar to another, disputing with health insurers all at the same time?
3. Make doctor fees transparent: Healthcare is one of the few industries where a consumer learns what it costs only after he has used the service. It doesn't have to be this way. What prevents a health insurer such as Aetna, Cigna or United Healthcare from creating a mobile application that allows a consumer to see list pricing for a given service along with what they could be expected to pay for a procedure out of their own pocket? Imagine a mobile application that could provide a predictive diagnosis and then tell the consumer how much that doctor visit may actually cost upfront?
4. Encourage more volunteerism: I live in a town with a volunteer ambulance system which is much cheaper to run than a for-profit entity. Why can't more municipalities leverage the finer aspects of volunteerism in this regard? Imagine a scenario where a nurse attended a state school such as University of Connecticut and has amassed some debt. Couldn't he/she work off this debt through volunteerism?
5. Make more drugs available over the counter: I, like many other IT employees, work in a stress-filled environment and have developed high blood pressure. I take my daily dose of Atenolol. What would happen if this were made a behind-the-counter drug that you simply had to ask for? It is not a drug that even the most deranged would be interested in taking. It doesn't make you feel good, doesn't introduce any form of psychosis and doesn't even taste good. Why involve a doctor to write a formal prescription, a pharmacist to fill it, and so on? Can't we leave these well-trained medical practitioners to do something of higher importance?
Wednesday, December 14, 2011
Leveraging the worst developer to improve quality of enterprise applications
Before we get started, I must disclaim the fact that my idea will not work if the challenge of bad developers occurs in scenarios where consultancies such as Accenture have backed up the school bus to your enterprise and sent you more than one or two. That particular challenge is left for another blog entry.
A possible approach is to create a new role I will label Project Saboteur, filled by your suboptimal developer. Their job is to check out a parallel copy of the source tree and then introduce bugs on purpose. If a service is supposed to return ten values, they deliberately hack the code to return eleven. In theory, every bug they introduce should be caught by an existing test. If not, you've found a hole. Categorize the holes, and you may find systematic weaknesses in the tests. Count the holes as a ratio of overall bugs introduced and you have a very rough idea of the percentage of bugs your tests are finding.
Since we know that this individual can't write quality code, we should leverage them in ways where we need to not write quality code. The art of defect seeding is something that can also test other enterprise processes such as whether the code review process is beneficial or strictly done to appease the PMO organization and is ceremonial in nature.
Defect seeding can be a strong motivator for reviewers to seek out and uncover defects more thoroughly. Instead of treating the incompetent developer as a burden on the team, turn it into a challenge where everyone is incentivized to find as many of the known/seeded defects as possible. On Agile teams, this tends to add more synergy and zeal to inspection efforts...
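A minimal sketch of the seeding loop described above, with invented names and a deliberately trivial "service"; real seeding would mutate production code and run the actual suite against it:

```java
import java.util.ArrayList;
import java.util.List;

public class DefectSeeder {
    // Flipped by the project saboteur to seed a known bug.
    static boolean seedDefect = false;

    /** Contract: return exactly ten values. */
    static List<String> tenValues() {
        List<String> out = new ArrayList<>();
        int n = seedDefect ? 11 : 10; // seeded bug: one value too many
        for (int i = 1; i <= n; i++) out.add("V" + i);
        return out;
    }

    /** Stand-in for an existing unit test of the service. */
    static boolean existingTestPasses() {
        return tenValues().size() == 10;
    }

    /** Rough estimate of suite effectiveness: fraction of seeded bugs caught. */
    static double catchRate(int seeded, int caught) {
        return caught / (double) seeded;
    }

    public static void main(String[] args) {
        seedDefect = true;
        // If the existing test still passes with the bug seeded, we've found a hole.
        System.out.println(existingTestPasses()
            ? "HOLE in the suite" : "caught the seeded bug");
        System.out.println(catchRate(20, 15)); // e.g. 15 of 20 seeded bugs caught
    }
}
```

Run the loop over a batch of seeded bugs and the catch rate gives you the rough coverage estimate the post describes.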
Tuesday, December 13, 2011
What does a CIO do all day?
If you listen to the vendor-controlled media, you would think the only thing they do is listen to thinly veiled PowerPoint pitches from consulting firms and industry analysts sprinkled with a few software purchases. There is more to the job of being CIO than simply best practices in management by magazine.
Is the job of the CIO to push back? When does this mean being an impediment to embracing advances in technology? The CIO must ensure new technology will really make things more efficient. Imagine a world where everything sold by software vendors were true: you would have achieved a combined 1m % ROI and could run the entire IT department of a large corporation with just one person working part-time. We all know that this simply isn't true.
I am frustrated with vendors and their pitches to save money. Just because something saves money doesn’t mean you buy a new one each year. Imagine a car with better gas mileage than your current car. If you bought that new car for an extra $5,000.00, you must drive it enough to actually realize the $5,000.00 in savings. If you replace it yearly, you never accumulate enough savings to justify the purchase.
So at some level, I think the job of the CIO is to exercise fiscal prudence especially in shops where Enterprise Architects are too busy being conned to pay attention to the next wave of technology. After all, in order for IT to remain viable, it must become business aligned and sadly, most enterprise architects are asleep at the wheel in this regard.
Thursday, December 08, 2011
Information Security Control Worst Practices
I think the individuals asking questions regarding policies and controls are sincere. Likewise, I think industry analysts provide answers to questions asked, but never take the next step to figure out if their audience is truly asking the right questions in the first place. There are general worst practices I have noticed:
1. Most controls cannot be implemented: This occurs for a variety of reasons, ranging from the state of current technology to the simple fact that many of them get in the way of the business wanting to do business. How many corporations have a ridiculous policy on mobile device usage? How many of these same corporations have actively funded enterprise projects that run counter to it? There are numerous other examples of this type of idiocy. Information security professionals need to get unimplementable "controls" off the books.
2. Most controls cannot scale: The best example of this scenario is how the PCI DSS requirements came about. For example, they did a great thing in requiring credit card merchants to take certain steps for their web-based applications. Did you know that the majority of stolen credit card data has been lost through websites vulnerable to SQL injection and cross-site scripting? PCI put in a requirement that code reviews be performed by someone independent of the development team. Do you think PCI requires its auditors to know anything about software development?
3. Most controls actually ignore sound risk management practices: Most controls use Boolean logic to ascertain whether you comply or not. The reality of security is more nuanced. Imagine you audited two enterprise applications and discovered that one has over one million OWASP Top Ten vulnerabilities but has only two users, is used only one day a year and can be shut down when not in use. The other application has only one vulnerability but is used by thousands of employees and is Internet facing. Which one will the auditor get their jollies off on?
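To illustrate the point, here is a toy risk-weighting sketch; the formula and weights are invented for illustration and are nothing like an actuarial model, but they show how exposure can dominate raw vulnerability counts in a way a Boolean checklist never captures:

```java
public class RiskWeight {

    /**
     * Toy risk score: vulnerabilities matter only in proportion to exposure.
     * The log dampens raw vuln counts; the weights are illustrative guesses.
     */
    static double score(long vulns, int users, int daysOnlinePerYear,
                        boolean internetFacing) {
        double exposure = users
            * (daysOnlinePerYear / 365.0)
            * (internetFacing ? 10.0 : 1.0);
        return Math.log1p(vulns) * exposure;
    }

    public static void main(String[] args) {
        // The million-vuln app: two users, online one day a year, internal only.
        double appA = score(1_000_000L, 2, 1, false);
        // The single-vuln app: thousands of users, always on, Internet facing.
        double appB = score(1L, 5_000, 365, true);
        // A pass/fail checklist flags app A; any exposure weighting flags app B.
        System.out.println(appA < appB);
    }
}
```

Any weighting of this general shape, however crude, ranks the two applications the opposite way the Boolean audit does.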
4. Most controls are tightly coupled to how a product is implemented: Walk into a corporation and read their controls on passwords. You will note that many of them somehow magically align to how Active Directory, RACF, etc are configured. Now ask yourself, what happens to the control if you don't use passwords in any form but there is otherwise a vulnerability?
5. Most controls are infrastructure-centric and ignore the concept of assets: Is there anyone who thinks the Federal government will shrink its budget? Is there anyone who thinks that the information security department in their own organization doesn't work like the Federal Government? Ever look at how the business invests money? You may discover that they spend more money on applications than they spend on infrastructure, so wouldn't it make more sense to figure out how to secure the applications over securing the infrastructure? Consider the trend of the business moving their enterprise applications to the cloud. Are the controls and ultimately the solutions that implement the controls portable to the cloud or are they only targeted at the internal infrastructure which is increasingly becoming less critical...
Tuesday, December 06, 2011
Enterprise Architecture: Where have all the developers gone?
When I started my illustrious IT career in high school in the 80s, working for Cigna as part of their Application Field Services division, pretty much everyone in IT knew how to sling code. Nowadays, you are lucky to find an enterprise IT shop where 25% of its inhabitants know how to code in a modern language.
Many within the Enterprise Architecture community can wax poetic about the need for business and IT alignment. Few, however have realized that prior actions in the spirit of aligning may have actually moved us further away.
I remember when business customers used to roam the corridors of IT where they would strike up a conversation on whatever they were noodling. They could intellectually test their thought processes against pretty much the first person they found. Nowadays, they are lucky if they get to do this at all, and if they do it is more than likely mired in formalism that takes several weeks to accomplish.
We have outsourced all the developers to different countries in different time zones, and now the enterprise architecture team is the last bastion of hope. Is it better for them to focus on meta-issues such as whether to leverage Zachman vs. TOGAF, or should they instead figure out how to bring back genuine conversations with the business?
In the age of turning everyone into a plug-compatible human resource, we have managed to make things less human. It used to be very easy for a business person to identify someone in IT. Why did we make it so much harder for our customers to recognize us?
To find a real developer, you must go on a pilgrimage to a dark corner with a strange blue glow, following the trail of Twinkies, Little Debbies and Red Bull cans... You may ask the developer there if he/she is one, but the answer probably won't come in a language you understand. If the response is incomprehensible, you've found what you're looking for.
Does the business really want better PowerPoint that uses the corporate template when communicating with them or would they prefer a genuine conversation that is thoughtful and timely even if they don't understand it all?
Monday, December 05, 2011
Why "Agile" has only had a modicum of success in insurance
1. The engineering mindset is undervalued: Let's acknowledge that Agile has been wildly successful in companies and cultures where the engineering mindset is valued much more than the project management mindset. Insurance carriers tend to be the opposite.
2. The focus on documentation: Can you think of an industry vertical that outranks an insurance carrier in sending confusing documentation to its buyers? They have a habit of sending 100-page documents such as policies to 90-year-old grandmas and couldn't care less whether the receiving party can understand them. So what makes us think that writing understandable documentation is going to happen?
We can all get caught in the vortex of asking for good documentation, but this is a trap. There is nothing in any Agile methodology that can overcome this challenge. I can tell you from experience, I'd love a proper requirements document, but I've yet to see one. Every requirements document I see is loaded with assumptions on both sides, and rarely do both sides agree on the assumptions. I had one experience where the customer agreed that one "feature" we were developing for them was totally not what they wanted, but they were afraid to alter the requirements document for fear we'd negotiate away some features they did want. So we went merrily along developing something they would not use. Doesn't that feel like your insurance policy?
When insurance carriers adopt the policy of brevity, then they may stand a chance.
3. IT doesn't know who the customer is: Having an on-site customer representative is risky, because he/she is bound to be fairly junior. Is the customer really going to spare a senior decision-maker for an entire year? Regardless, the customer representative "becomes" the formal requirements specifier.
Now, combine this thought with the acknowledgment that many insurance policy administration systems have more tenure than even the most tenured IT employee. Insurance business logic is a collection of exceptions that have been built up over time and has reached the point that no individual even knows them all. Usually, they have been lost in the documentation archives and survive only in code. Therefore, the person that knows how the business works the best in its current state tends to be some lowly IT employee.
The practice I think is best employed is to sometimes go customerless and to encourage IT to adopt the principle that before they write code, they need to learn how to read code.
Saturday, December 03, 2011
Is forcing developers to work in cubicles a worst practice?
If you think about the roles of Architects and Developers, unlike other roles, you will quickly see that much of their work is a combination of art and science. Project Management, on the other hand, tends to be more scientific in its approach. That is, if you ignore the point in time where project managers mark their project green on status reports. On a six-month project, they are 90% done with nine more months to go. You know what I am talking about.
Anyway, programming is an intense activity that requires extended periods of quiet and concentration. Cubicle environments are noisy and distracting. Programming often involves brainstorming, with whiteboards historically the tool of choice, yet cubicles do not support large whiteboards. Going from a remotely placed whiteboard back to the cubicle workstation requires copying the contents of the whiteboard onto paper, which is time consuming.
Programming often involves reading books. Books are best read with directed and controlled lighting (a reading lamp, natural light over the shoulder, etc). Cubicles offer overhead indirect lighting shared by all. Books are also best read in comfortable chairs, while cubicle chairs are ergonomically designed for workstation activities such as typing on a keyboard and looking at a monitor. Even the best architects and developers I know spend at best 25% of their time typing. So, at some level, cubicles de-optimize the majority of the activities of architects and developers while optimizing the minority of their time.
Artists do not work in cubicles; they work in studios. They need space and natural light. They need a muse. Scientists also do not work in cubicles; they work in laboratories. They need equipment and whiteboards. They need inspiration. If we as IT professionals continue to deliver projects late and of suboptimal quality, then how come no one has put on their thinking cap and figured out that we don't need more methodologies but simply a change of environment?
In a culture where your CIO is craving innovation, isn't it ironic that he/she puts their people in cubicles and then tells them to think outside the box? One trend that deserves further analysis is the teleworker movement. Many corporations in fear of liability surrounding workers compensation claims are mandating that teleworkers establish equitable work environments. While your walls at home may not be covered in fabric, at some level the policy is encouraging one to still work within a box.
So, you can either work in a box at work or work in a box at home. Why are there only two choices? If you have ever visited a large enterprise and truly cared to find the place where productivity is at its highest, may I suggest you visit the cafeteria outside of lunch hours? You will see many having the space they require, convenient access to life's necessities and most importantly the ability to have an open conversation at human tone without disturbing others...