Thursday, November 29, 2007

Collaborative Content Culture: Part 2 - Users Will Find a Way

This is a continuation of this post.

Recent changes in governance and oversight have engendered corporate cultures that place a premium on the ability to audit and manage resources. The resulting policies can appear arbitrary and confusing to business users as IT resources struggle to work within newer workflows designed to support these requirements. Often there is a disconnect between the functionality users enjoy in their private lives and what they must accept at work. Tools that are easily downloaded and customized to empower families and friends to share and collaborate via real-time voice chat, digital photo albums or whiteboard utilities may not be sanctioned corporately. That gives rise to frustration: a user’s parents or her husband can appear less technically savvy and yet share images cross-continent, so that mere minutes after Mom has taken pics of Little Johnny, the child’s grandparents can order prints from the local Wal-Mart or Costco, while the workplace IT staff appear incapable of understanding or caring that tools exist that would enhance users’ ability to share data and functionality.

Part of the problem is education. Business users have spent years building intellectual property, and to the degree that they can be educated about the dangers of digitized data and the need to properly vet new technologies and tools from numerous angles and with varied skillsets, they will probably be more amenable to delays in, or outright refusal of, permission to install tools like Twitter or to frequent social sites using corporate resources.

Another part of the problem is IT's focus. Perhaps IT staffers are people who enjoy less personal interaction and derive some satisfaction from wielding control, the way we control computers via programs or control networks with routing tables. For many programmers, interfacing with business users can be complicated, frustrating and confusing. Who doesn't have some story about how obfuscated business processes become and how difficult it is to codify what the business users actually want? If it's the reader's experience that IT personnel like to exert control, then the leap is not so illogical: as corporate standards shifted from "figure out what the users need and make it" to "protect our digital assets, and here's Microsoft's latest security tools to help you", we may have dropped the ball, focusing on creating workflows, audit trails and authority domains without considering how negatively these changes could impact our users' collective experience with technology. Anyone who reads blogs can see that Project Management and Business Process Modeling are, and have been, buzzwords. Yet in this author's experience, as restrictions are applied to help protect assets, it is often to the detriment of business users' empowerment and support. The teams who come to wield power over access, or who implement new security protocols, may make good efforts to educate users and delineate those processes, but sometimes those communications fall short of their intended audience, and the ones who suffer are the business users who must accept ever-longer delays in problem resolution or in the approval and installation of newer, enabling technologies.

I understand some changes are "in the corporate interest", and there can be no doubt that the corporation, its officers and the IT staff, as the keepers of the data who "maintain the borders", bear a responsibility to protect corporate assets. This post, however, is focused on employees “finding a way”: as Jeff Mann pointed out, in numerous organizations users already have found ways to defeat corporate policies, whether illicitly or through ignorance.

Considering the dichotomy between the empowerment for which technology is praised and the culture of technological restriction that has only recently seen strong growth in the corporate, Microsoft world, it’s no wonder that business users may feel entitled to use whatever they can regardless of corporate standards. Perhaps users think “if I can install it and it runs then it must be allowed” or “I’ll use it ‘til they tell me to turn it off” (this author has never used either maxim).

While corporate IT staffs must respond to potential threats to the company’s intellectual property, we have invested years in fostering the attitude that technology is designed to enable users. We should now devote attention to tools that help us protect our companies, but those tools must make allowances for collaborative content, for if we don’t, we’ll find that users may overwhelm our resources with untested ‘Net-enabled applications out of a genuine desire to enhance the value of their collaborative work. Disregarding the shift away from the empowering and enabling facets of technology toward the more moderate position of validating users before empowering them would be a disservice to our users. We need to:

  • Acknowledge this difference in IT focus
  • Educate the users to what we're changing in our processes and why
  • Enhance their functionality and collaboration
  • Apply validating security that protects our digital assets

Wednesday, November 21, 2007

Collaborative Content Culture: Part 1 - Resistance

I attended a recent webcast with Jeff Mann of Gartner Group and Cheryl McKinnon of Open Text. The topic was enterprise functionality in terms of content and collaboration.

Jeff is a superb speaker with obvious ease in this topic domain, an ease grounded in a depth of expertise.

Several facets of content and collaboration were touched on during the hour-long webcast, so obviously none of these topics were drilled into with any depth, but they make a great outline for future posts.

  1. How Collaboration and Community-Building May Give Rise to Insecurity and Resistance
  2. Users Will Find a Way
  3. Social Networking: What's Valuable in the Fortune 500 Corporation?

1. Insecurity & Resistance to Collaboration

Today in the US technology business it is imperative that IT individuals, and the teams they comprise, cooperate across responsibility boundaries. Coordinating with business goals has always been a burden under which IT labored, but in the era that began in earnest five years ago or so it became increasingly important that IT teams become segmented in their responsibilities and capabilities. With the passing of Sarbanes-Oxley, security, audit, governance and oversight became watchwords, a trend that only intensified with the off-shoring of IT assets. IT professionals started finding that rights they had taken for granted were progressively removed so that the corporate officers who would be held responsible could track resource access and validly state that they maintained strict control over corporate governance. A decade ago, if a developer needed a development version of a database, he would copy the Production data and put it on his laptop or a shared development server. This placed the responsibility on the developer but allowed him to be very responsive to the business owners of the project he was working on. In today's corporate environment, a developer must submit a request to someone with the requisite rights to servers to have a development server/database assigned to him, and in some cases submit a subsequent ticket to have the production data retrieved, scrubbed and put onto the development database. Each request usually requires numerous sign-offs and approvals before it can be completed. The governance controls established in this environment mean that IT resources must become much more efficient and proactive to remain as responsive to their business partners.

It doesn't take much creativity to see that as these types of controls are overlaid across the whole enterprise, business users and IT resources alike must raise the level of their professionalism, efficiency and proactivity to maintain historical project timelines.

Using technology that is already corporate-sanctioned, such as e-mail or internal company instant messaging, is one way that collaboration is enabled, but oftentimes business and IT resources who find themselves under deadlines may feel greater pressure to utilize untested collaboration methodologies because of the overhead of getting corporate approval and buy-in on newer technologies. Sometimes the obstacle to adoption of new methods (citing Jeff here) is a manager's personal insecurity, which could stem from fears that if his direct reports build a higher level of community with their business resources he may be marginalized, may not have information required by business owners and project drivers, or may find himself held accountable for decisions that are not in compliance with corporate standards. It was to this level of personal insecurity and resistance that Jeff directed his comments when he stated that:

"workplace failure in the last few years has moved from technical issues to
failure of teams to incorporate or utilize recently enabled functionality
because it may seem to threaten someone's or some team's territory."

Saturday, November 03, 2007

MSDN content

Sometimes I forget how nice it is to have an MSDN account provided to me; then comes a weekend like this one. As I re-engage my tech career and work on rounding out my skills, exploring more Content Management Systems (CMS), preparing for my certification and driving deeper into Information Lifecycle Management (ILM), I downloaded some 30 GB of MS servers, tools and training. I've burned numerous CDs & DVDs and built a few virtual PCs today.

MS does make integration ridiculously easy, and having access to all levels of these tools & servers (Standard to Professional to Enterprise, etc.) makes creating multiple versions and customizations handy.

Friday, November 02, 2007


Virtualization is either a buzzword you use at work or it's about to be (link to another article). At the end of the last century a small company, VMWare, started producing a software package, also named VMWare, that allowed you to use a single server as though it were multiple servers. This may seem like a ridiculous concept: if you are used to how slowly your computer runs, it seems counterintuitive that you'd gain anything from making your computer work for two, three or four people all at the same time. The catch, however, is that a server's full processing capacity is rarely in constant use. Server computers tend to have higher-caliber processors (sometimes several of them), more robust internal structure and faster drives. Being able to divide all the processing power of those machines among multiple programs or teams simply means fuller usage of those resources, as well as less administrative overhead to support the same number of teams (four development teams using a single piece of hardware as opposed to each having their own hardware). When considering servers you can often hear terms such as load, load-balancing, fail-over and utilization; these refer to how much a server is used and how it can pass along some of its work to other servers if it gets too busy.

Virtualizing multiple instances of the OS on a server means better resource utilization (as long as the server can handle the load). Companies like VMWare (purchased by EMC2), Microsoft (producers of Virtual PC) and SWsoft (makers of Virtuozzo) make software that allows users to create virtual servers hosted on a single physical server.
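The consolidation arithmetic behind this is simple enough to sketch. The figures and the headroom rule below are illustrative assumptions, not measurements from any real deployment:

```python
# Rough back-of-the-envelope consolidation check: several lightly
# loaded workloads can share one host as long as their combined peak
# load stays under the host's capacity, with spare headroom left for
# spikes and the virtualization layer itself.

def can_consolidate(peak_loads, host_capacity=1.0, headroom=0.2):
    """Return True if the combined peak CPU load (fractions of one
    host) fits on a single host while preserving `headroom` spare."""
    return sum(peak_loads) <= host_capacity * (1 - headroom)

# Four dev servers each peaking at 15% CPU fit comfortably on one host.
print(can_consolidate([0.15, 0.15, 0.15, 0.15]))   # True
# Two servers peaking at 50% each do not fit once headroom is reserved.
print(can_consolidate([0.5, 0.5]))                 # False
```

This is, of course, only the CPU dimension; memory, disk I/O and network would each need the same kind of check before consolidating real workloads.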

This technology is relatively young but well tried, and a solid performer for businesses. Virtualizing servers has a positive payback for most implementations.

Newer on the scene, however, is the concept of storage virtualization. This concept is so recent that there is little evidence it resolves as many issues as it introduces. One benefit is freedom from resource management at the user level: users don't have to remember that the H: drive has 120 GB free while the J: drive only has 4 MB available. With storage virtualization, a wrapper overlays the storage layer and makes it addressable to the client as a single entity. Users theoretically just connect to the virtual storage repository and their data is stored for them by the wrapper. Another benefit of storage virtualization is that files gain some resiliency to hardware failure, and hardware appears to be hot-swappable beneath the virtualized wrapper.
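To make the "wrapper" idea concrete, here is a toy model of it. Everything here (class name, volume labels, sizes) is invented for illustration; real products operate at the block or file-system layer, not in application code:

```python
# Toy model of a storage-virtualization wrapper: several backing
# volumes are exposed to the client as one pool. The client just
# stores a file; the wrapper picks a volume with enough free space,
# so nobody has to remember which drive letter has room.

class VirtualStore:
    def __init__(self, volumes):
        # volumes: mapping of volume name -> free bytes
        self.free = dict(volumes)
        self.placement = {}   # file name -> volume it landed on

    def put(self, name, size):
        """Store a file of `size` bytes on any volume that fits it."""
        for vol, avail in self.free.items():
            if avail >= size:
                self.free[vol] -= size
                self.placement[name] = vol
                return vol
        raise IOError("storage pool exhausted")

    def locate(self, name):
        """The wrapper, not the user, knows where the file lives."""
        return self.placement[name]

store = VirtualStore({"H:": 120 * 2**30, "J:": 4 * 2**20})
vol = store.put("report.doc", 10 * 2**20)   # wrapper chooses the volume
print(store.locate("report.doc"))
```

The hot-swap resiliency mentioned above would live in the same layer: the wrapper could migrate `placement` entries off a failing volume without the client ever seeing a path change.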

Virtualized storage is different from SANs, NAS and, although it shares some characteristics with CAS, it is a different creature from CAS as well.

Network World "Vendor claims about storage virtualization flawed"
Enterprise Systems "Next-Generation Storage: Think Virtual (2005)"

Thursday, November 01, 2007

Predictive Analytics and Content

Content management incorporates many principles into a paradigm. Any implementation of these principles may be more or less cohesive depending on how well the business processes are understood and how well the underlying functionality is customized to support and extend those processes.

A relatively recent benefit of content management is predictive analysis. Predictive analysis is an example of implementing business intelligence into an information lifecycle. This particular flavor of BI hinges on correlating seemingly disconnected pieces of data, using heuristics to produce, from a vast repository of knowledge entities, a probability matrix of which entities are most likely to fit some criteria. For example, Infinity Property and Casualty, an automobile insurance company, recently selected SPSS, Inc.'s Predictive Claims (tm) software to streamline its claims process (link). In this instance the company suggests that predictive analysis will allow it to assign claims to the adjusters best suited to handle each particular claim, with the side benefit of helping the company detect fraud more efficiently.

I believe most people probably don't think of their insurers as penny-pinching fat cats who eschew handing out a dime, but if you are one of the minority that does, here's something that might help. Handling claims costs money. While there are some instances of egregious behavior across the insurance universe, there are also myriad stories of claims reps, in every branch of the industry, who make heartfelt, sincere efforts to help their claimants throughout the claims process and get the insured customers' funds out to them in a timely manner. Unfortunately, just as there may sometimes be a person, process or unspoken policy that hampers even the best-intentioned claims rep, there are also unscrupulous individuals who might fake an injury or have an estimate written by a "friend" to defraud the insurance provider. Predictive analytics is a hedge for the insurance industry against just such fraud, and that helps you and me (assuming you're not one of the people perpetrating it). Believe it or not, in the same manner that an actuary can often accurately predict who is most likely to have a wreck or get injured (which is how insurance companies determine your rates), predictive analytics can very accurately narrow the pool of potential fraud perpetrators. That way the limited investigative teams of insurance adjusters can spend their limited time checking out the other guy who claimed his late-model Lamborghini got a scratch at the mini-mart, and your car insurance carrier doesn't waste too much time having an adjuster come out to your house to see how your truck got totaled by the dude rushing to get to work last week.
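The heuristic correlation described above can be sketched in a few lines. The rules, weights and claim fields below are entirely made up for illustration; I have no idea what signals a product like Predictive Claims actually uses:

```python
# Minimal sketch of heuristic claim scoring: each rule contributes
# weighted points of suspicion, and claims are ranked so that the
# limited pool of investigators looks at the highest scores first.
# Nobody is "convicted" by the score; it only prioritizes attention.

def fraud_score(claim):
    points = 0
    if claim["days_since_policy_start"] < 30:
        points += 4   # claim filed very soon after buying coverage
    if claim["estimate_source"] == "unverified":
        points += 3   # repair estimate not from an approved shop
    if claim["prior_claims"] >= 3:
        points += 3   # unusually frequent claimant
    return points

claims = [
    {"id": 1, "days_since_policy_start": 400,
     "estimate_source": "approved", "prior_claims": 0},
    {"id": 2, "days_since_policy_start": 12,
     "estimate_source": "unverified", "prior_claims": 3},
]

# Rank claims so adjusters start with the most suspicious one.
for c in sorted(claims, key=fraud_score, reverse=True):
    print(c["id"], fraud_score(c))   # 2 scores 10, 1 scores 0
```

Real systems learn the weights statistically from historical fraud outcomes rather than hand-coding them, but the output is the same shape: a ranked list telling scarce investigators where to look.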

Sometimes it can be annoying to think that anyone has this much information and the ability to use it to predict things, but it's important to keep two things in mind about predictive analytics:
1. The only cases I've heard of are where predictive analytics is used to create a matrix of people for someone to check into; no one is ever convicted based on predictive analytics. These tools are used to better assign resources.
2. Just because the probability that you will do X is higher than the probability that I'll do X does not mean that you'll do X before I will. People have a capacity for creativity, and while the degree to which someone operates based on upbringing, environment and heredity is a discussion for another time, it is eminently clear that content systems do not contain all possible permutations of events that might contribute to a decision you make. A million computers running predictive analytics could analyze volumes of data about you and all reach the same conclusion that you're going to be in a car wreck next Tuesday, and yet, through the sheer capricious nature of chance, you might not be. Just because something is predicted, even by these impressive predictive tools, does not mean it has to happen.

Hope this gives you some insight into predictive analytics and content management from the 20,000 foot view.

For a more educated view of predictive analysis, see the 24-page White Paper produced by the AICPCU/IIU - link.