Thursday, July 29, 2010

Beyond SAS 70

By R. Edwin Pearce (epearce@egisticsinc.com)

A new study from Gartner confirms something that eGistics (www.egisticsinc.com) has known for some time: there's a lot more to effective security, privacy and continuity than compliance with Statement on Auditing Standards (SAS) 70.

"SAS 70 is basically an expensive auditing process to support compliance with financial reporting rules like the Sarbanes-Oxley Act (SOX)," says French Caldwell, research vice president at Gartner. "Chief information security officers (CISOs), compliance and risk managers, vendor managers, procurement professionals, and others involved in the purchase or sale of IT services and software need to recognize that SAS 70 is not a security, continuity or privacy compliance standard."

Published by the American Institute of Certified Public Accountants (AICPA), SAS 70 gives a service provider's auditor guidance on how to report on process-related risks relevant to financial statements and transaction processing. Intended for use by the customer's auditor, the result of a SAS 70 audit is either a Type I attestation that the processes as documented are sufficient to meet specific control objectives, or a Type II attestation, which additionally includes an on-site evaluation to determine whether the processes and controls actually function as anticipated.

Gartner believes a SAS 70 Type II evaluation does provide a very high degree of assurance that the examined controls are effective. The performance of controls is evaluated over a period of time; it is not just a snapshot of control effectiveness. However, customers should never assume that the provider has implemented all the appropriate controls, Gartner says.

"To ensure that vendor controls are effective for security, privacy compliance and vendor risk management, SAS 70 ... and other national audit standard equivalents should be supplemented with self-assessments and agreed-upon audit procedures," Caldwell explains.

Interested in learning more? E-mail me at epearce@egisticsinc.com.

Tuesday, July 20, 2010

Cloudy with a chance of Microsoft


Microsoft CEO Steve Ballmer, known for his eyebrow-raising antics at company-wide employee meetings, is raising eyebrows again with his provocative and far-reaching statements about Microsoft and the cloud. On July 12, Ballmer told 9,500 attendees at the annual partners’ conference that “if you don’t want to move to the cloud, we’re not your folks.” The cloud, he says, is “inevitable.” Whew.

As early as the summer of 2008, Microsoft recognized that on-line delivery of critical business applications and services was “a sea change” in the way businesses and corporations want to be served. On-line delivery was then, and is now, recognized as part of a “services wave” that is leading some to dismiss the traditional model of licensed software and on-premise execution of business applications as increasingly “antiquated.”

Maybe yes, maybe no.

Ballmer acknowledges what corporations have been concerned about since the cloud began to form: security and compliance. He implies that companies that get this right are “way ahead” in providing a viable offering to the market.

This raises an important point about the selection of on-line services companies: choosing one that provides an on-line service is one thing; choosing one that has invested the time, capital, expertise and infrastructure required to provide world-class security, and that supports a variety of compliance mandates, is quite another.

It is our experience that large, security- and compliance-conscious institutions are taking advantage of the growing maturity of cloud services, especially for managing documents, transactional data, payment images and reports. As institutions become more comfortable with, and confident in, select cloud providers, expectations will increase regarding the use of such information for fraud detection and prevention, data mining, analysis, legal discovery, research and customer service.

Is your company catching the wave, dipping its toes in the water, or staying high and dry? 

Tuesday, July 13, 2010

Trends in ACH Dispute Management

Trends in ACH Dispute Management
Thursday, August 12 at 1 p.m. Eastern

As ACH volumes have grown, so too has the number of ACH transaction disputes that processors must manage. Expensive to handle, these disputes are subject to a complex mix of rules and regulations, and can lead to hefty charge-offs if improperly managed. Just how big a problem are ACH disputes? This webinar will share the results of an exclusive survey of ACH processors on trends in ACH dispute management, including volumes, costs, levels of automation, future plans and more. Attendees will be able to benchmark their operations, gain actionable insights from our panelists, and learn what some processors are doing to automate their ACH dispute processing.

To register, click this link https://www1.gotomeeting.com/register/589842392 or e-mail Dave Nitchman of IAPP-TAWPI at dnitchman@tawpi.org.

Panelists:
Rossana Salaris, principal, Radix Consulting
Amer Khan, senior vice president, product and sales support, eGistics

Moderator:
Mark Brousseau, facilitator, IAPP-TAWPI Payments and Receivables Council

Monday, July 12, 2010

Economic risks of data overload

By Ed Pearce (epearce@egisticsinc.com) of eGistics (http://www.egisticsinc.com/)

When data pours in by the millisecond and the mountain of information builds continuously, professionals inevitably cut corners and go with their "gut" when making decisions that can impact financial markets, medical treatments or any number of time-sensitive matters, according to a new study from Thomson Reuters. The study indicates that when faced with unsorted, unverified "raw" data, 60 percent of decision-makers will make "intuitive" decisions that can lead to poor outcomes.

Many government regulators have flagged increased financial risk-taking, which can be traced to some degree to imperfectly managed data, as a contributor to the recent financial crisis. Moreover, the world is awash with data -- roughly 800 exabytes -- and the velocity of information is increasing, Thomson Reuters says.

The challenge is that the staffing and investment needed to ensure that information and information channels are trusted, reliable and useful are not keeping pace. In fact, it is estimated that the information universe will grow by a factor of 44, the number of managed files by a factor of 67, and storage by a factor of 30 -- but staffing and investment in careful management by only a factor of 1.4.

"The solution to data overload is to provide decision makers with what Thomson Reuters calls Intelligent Information: better organized and structured information, rapidly conveyed to the users preferred device," says David Craig, executive vice president and chief strategy officer.

Fortunately, as the Thomson Reuters study notes, the same technological revolution that has produced the explosion of information also opens the way to new and improved tools for delivering exactly this kind of intelligent information.

"We must use the benefits of the information technology revolution to minimize its risks. This is a joint task that the private sector and governments must closely focus on if we are to avoid systemic crises, in the future, whether we speak of finance, healthcare delivery, international security and a myriad of other areas," comments Craig.

How is your organization managing information overload?

Saturday, July 10, 2010

Same-day ACH settlement highlights need for better dispute management tools

By Ed Pearce (epearce@egisticsinc.com)

Last week's announcement by the Federal Reserve Board of posting rules for a new same-day automated clearing house (ACH) service brought the topic front and center. Everyone from industry analysts and bloggers to trade publications and associations has expounded on the pros and cons of same-day settlement. But virtually unmentioned in all the hubbub is the potential for more ACH disputes as a result of accelerated settlement -- a scenario most banks are ill-prepared to manage.

Starting next month, the Federal Reserve Banks will offer a same-day settlement service for certain ACH debit payments through their FedACH service. FedACH customers may opt in to the service by completing a participation agreement. The service will be limited to transactions arising from consumer checks converted to ACH and consumer debit transfers initiated over the Internet and phone. Same-day forward debit transfers will post to a financial institution's Federal Reserve account at 5 p.m. Eastern time, while same-day return debit transfers will post at 5:30 p.m.

As a result of the faster settlement, banks undoubtedly will see more consumers coming into their branches complaining of unauthorized transactions. The limitations of traditional in-house ACH systems and the strict time constraints and complex processing requirements imposed by NACHA rules and Regulation E already have led to sharp increases in operations expenses and higher charge-offs associated with ACH disputes. A new influx of consumer disputes will require financial institutions to implement a more centralized, more streamlined approach to dispute management.

Several features will be critical:
  • Real-time, distributed data access for any authorized user, anywhere
  • Intuitive search capabilities
  • The ability to attach comments and annotations to disputed transactions
  • The ability to export data
  • Expanded search capabilities
  • Filtering capabilities to block or restrict access to certain transactions
  • Unlimited data storage
It may be some time before same-day ACH settlement achieves critical mass. But the next generation of consumers will demand it. This means that banks must begin adapting their ACH infrastructure today or risk even higher operations costs, as well as falling behind the competition. And this includes deploying sophisticated solutions to manage the inevitable spike in ACH disputes.

Friday, July 9, 2010

No pennies from heaven: controlling technology costs

GEORGE BAILEY: You don’t happen to have eight thousand bucks on you?
CLARENCE: Oh, no, no. We don’t use money in heaven.
GEORGE BAILEY: Oh, that’s right. I keep forgetting. Comes in pretty handy down here, bub.
IT’S A WONDERFUL LIFE


By Randy Davis (rdavis@egisticsinc.com)

While lost on Clarence the angel, the need for capital is obvious “down here.” If you are not raising capital, you are preserving it. Whatever is preserved can help manage cash flow or can be used for other purposes. Cost control is one means of preserving capital.

To my way of thinking, “cost control” requires at least the following three things:
  • Keeping total cost of ownership as low as possible (including maintenance and upgrades)
  • Paying for something only once if possible
  • Keeping costs predictable yet variable based on current factors
It is difficult, if not impossible, to achieve items 2 and 3 above with traditional software/hardware or do-it-yourself cost models. Hardware and software carry ongoing maintenance, upgrade and replacement costs. Although maintenance costs may be predictable, they are in fact a perpetual payment protecting the use of the hardware and software. Upgrade and replacement costs are not entirely predictable, partly because too many external factors (end of life, advances in technology, merger/acquisition or divestiture, etc.) control the timing of the expenditure. As things stand now, technology approaches obsolescence every three years.
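To make the trade-off concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it is hypothetical -- the license fee, the 20 percent annual maintenance rate, the three-year refresh cycle -- so treat it as an illustration of the two cost shapes, not as actual pricing:

    # All figures below are hypothetical, for illustration only.

    def licensed_tco(license_fee, maintenance_rate, upgrade_cost, years, refresh_cycle=3):
        """Cumulative cost of a traditional license: an upfront fee, perpetual
        annual maintenance, and a refresh roughly every `refresh_cycle` years."""
        total = license_fee
        for year in range(1, years + 1):
            total += license_fee * maintenance_rate  # ongoing maintenance payment
            if year % refresh_cycle == 0:
                total += upgrade_cost                # periodic upgrade/replacement
        return total

    def pay_once_tco(one_time_fee, years):
        """A 'pay once' structure: a single fee, no recurring charges."""
        return one_time_fee

    for years in (3, 6, 9):
        print(f"{years} years: licensed = ${licensed_tco(100_000, 0.20, 30_000, years):,.0f}"
              f" | pay once = ${pay_once_tco(120_000, years):,.0f}")

Under these assumptions the licensed model keeps accruing cost indefinitely, while the “pay once” figure is fully known on day one -- which is exactly the predictability argued for above.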

(If you don't think this is a hot -- even emotional -- issue, see my blog below, "Putting the kibosh on soaring software maintenance and upgrade costs.")

As an example of cost requirements, here are the findings of a Global Concepts study on the cost breakdown of an in-house digital archive:
  • Explicit base costs (such as servers, communications, disk storage, long-term storage, software, maintenance) are only half the total cost
  • Staffing adds another 40% on top of the base costs
  • Replication (disaster recovery/business continuity) requires an additional 36% above base costs
  • Implementation and development add another 24% to base costs
Once you have added all of these costs together for a secure and reliable in-house archive and normalized them to the whole (base costs of 100% plus 40% + 36% + 24%, or 200% of base), the total cost breaks down as follows:
  • Explicit base costs: 50%
  • Staffing: 20%
  • Replication: 18%
  • Implementation and development: 12%
Rather than a solution that requires fixed and sunk costs, what would be helpful is a solution with an entirely predictable “pay once” fee structure. A “pay once” fee structure should be the simplest, most predictable and controllable fee structure you can get from any of your vendors – partly because of what is avoided, namely ongoing costs for users, hardware, software and maintenance.

The critical difference between such a fee-based solution and any other cost structure is that while all other structures require ongoing, perpetual payments to keep things going, “pay once” does not. You avoid paying – every day for the life of the solution – for the privilege of using that solution. More importantly, you don't pay for excess capacity that either sits waiting to accommodate future requirements or goes unused because of a contraction in business.

I'm certainly not naive enough to think that a "pay once" fee structure is applicable to all, or even most, hardware, software or services. However, it appears that more and more businesses are demanding such a choice.

What is your experience with fee-based solutions?

Thursday, July 8, 2010

Pressure is rising for data center managers

By R. Edwin Pearce (epearce@egisticsinc.com)

Just when data center and IT managers assumed things couldn’t get any worse, along comes a report from Gartner predicting that the critical issues facing data centers – namely, technology, space and energy challenges – will worsen in 2010. Coupled with the tremendous cost pressures brought on by the economic downturn, the Gartner report adds a heightened sense of urgency for data center and IT managers seeking pragmatic ways to address their operations issues. In its report, Gartner offers several tips for reducing data center costs:
  • Eliminate underutilized or outdated systems
  • Consolidate multiple sites
  • Better manage energy and facilities costs
  • Better manage people costs
  • Delay the procurement of new assets
To be sure, these are all sound strategies. But savvy data center managers already have implemented (or at least considered) these strategies in response to the economic downturn. In other words, many data centers may have already squeezed as much savings as possible from their infrastructure.

However, the following challenges still remain:
  • Finding ways to implement a variable expense model to take advantage of reductions or slowdowns
  • Improving security and regulatory compliance
  • Staffing to support multiple hardware/software environments
  • Reducing excess capacity while maintaining the ability to grow
  • Managing the increasing need for backups and redundancy
None of this is new, of course. These are the normal and continuing requirements for doing business, perhaps "heightened," as Gartner says, by the subdued economy. 

How are you responding to your own data center information management challenges? If you had your way, what would you have your company do differently than they are doing today?

ACH Dispute Management Survey

NOTE: The survey is now closed. Tune in to the results Webinar at 1 p.m. EDT on Thursday, August 12 for a discussion of the results and insights into ACH processing and dispute management practices.

Sign up for the Webinar here: https://www1.gotomeeting.com/register/589842392

Through July 15, eGistics conducted a survey of ACH processing and dispute management practices among respondents with specific knowledge of ACH processing. All known participants will receive the survey results once they are compiled, along with an invitation to a special Webinar that will present and discuss them. We believe you will find the results helpful in understanding how ACH processing and dispute management are being practiced across the industry, and the various challenges ACH processors are facing. To protect respondents' privacy, the survey could be taken without revealing any contact information about the respondent or his or her company.

Thank you for your participation. Your input will help provide a clearer understanding of how you and others are addressing the challenges of ACH processing and dispute management. Check back here for the final survey results and a discussion of the findings.

Wednesday, July 7, 2010

The state of storage

By Mark Brousseau (markbrousseau@tawpi.org)

Randy Davis (rdavis@egisticsinc.com) of eGistics, Inc. (www.egisticsinc.com) finds several interesting trends in The 2010 State of Storage Report from Network Computing.

1. The top planned storage project for 2010 is improved allocation
2. Forty-seven percent of respondents say insufficient storage resources for mission-critical applications is their No. 1 concern
3. Storage area network (SAN) vendors are responding to demands for lower-cost storage
4. Storage virtualization is growing
5. Thin provisioning is catching on
6. There is a significant increase in interest in cloud-based storage

How do these trends reflect your storage strategy?

A welcome cloud during the economic recovery

By Ed Pearce (epearce@egisticsinc.com)

In spite of hopeful signs that the economy is on the mend, the 2010 State of Storage report from Network Computing finds that the fallout from the recession has left IT execs without the resources necessary to store the rising volume of information required to support their business applications.

Nearly half (47 percent) of the respondents to the survey say they have insufficient storage resources for their mission-critical applications, while 30 percent say they have insufficient tools for storage management. Another 30 percent of respondents say they have insufficient storage resources for departmental/individual use. Nineteen percent say they lack staff for their storage requirements.

And -- regardless of economic "green shoots" -- the situation isn't likely to change any time soon: 34 percent of respondents say they have an insufficient storage budget to meet their business demands.

Against this backdrop, it's little wonder that survey respondents are showing increased interest in cloud storage services (34 percent in 2010 versus 19 percent in the 2009 State of Storage report).

With a hosted, variable-cost storage model, if your business struggles and your volumes drop, your operations costs stay aligned with your usage, and you won’t pay for a “just-in-case” capital investment. The variable-cost model also eliminates the need for capital investment (software licenses and hardware) and maintenance contracts; customers typically are charged a one-time load fee to archive documents. And when an array fills up, or a server must be replaced, it’s your service provider’s problem. With a thin-client interface, there may not even be software to install, manage or maintain. In addition, variably priced storage solutions can facilitate more efficient operations by providing scalability that would be cost-prohibitive in a traditional, licensed in-house system.
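As a rough illustration, here is a short Python sketch of how a usage-based fee tracks business volume while a fixed in-house investment does not. The per-document rate, fixed annual cost and volumes are all hypothetical, not actual eGistics or vendor pricing:

    # Hypothetical figures throughout -- illustrative only, not vendor pricing.

    FIXED_ANNUAL_COST = 50_000   # depreciation + maintenance on owned in-house capacity
    PER_DOCUMENT_FEE = 0.02      # usage-based hosted rate (one-time load fee per document)

    yearly_volumes = [3_000_000, 2_000_000, 1_200_000]  # business volume contracting over time

    for year, docs in enumerate(yearly_volumes, start=1):
        hosted_cost = docs * PER_DOCUMENT_FEE
        print(f"Year {year}: in-house = ${FIXED_ANNUAL_COST:,}, "
              f"hosted = ${hosted_cost:,.0f} ({docs:,} documents)")

In year 1 the hosted model costs more than the fixed investment; by year 3, with volumes down, it costs roughly half -- the cost has followed the business instead of the capacity.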

Commonwealth Bank of Australia (CBA) Chief Information Officer Michael Harte spoke for many users when he recently told the Committee for Economic Development of Australia that "I will never implement an internal solution for a common problem that I could procure on subscription across the Web."

With the economic recovery still gaining strength, the trend for 2010 will be the more efficient use of existing IT resources. That should make hosted solutions a welcome cloud during the turnaround.

Putting the kibosh on soaring software maintenance and upgrade costs

By Randy Davis (rdavis@egisticsinc.com)

Finextra reports that in a recent speech to the Committee for Economic Development of Australia (CEDA), CBA Chief Information Officer Michael Harte lambasted legacy technology vendors for their slow embrace of cloud-based computing and their apparent preference for solutions that lock users into a "never-ending spiral" of costly maintenance and upgrades.

"We're saying that we will never buy another data center. We will never buy another rack or server or storage device or network device again," Harte said. "I will never let any organization that I work for get locked into proprietary hardware or software again. I'll never tell my teams in the business that it will be weeks to get them hardware provision. I'll never pay upfront for any infrastructure and certainly would never pay for any, or rent any, infrastructure that I would never use."

Harte concluded: "I will never implement an internal solution for a common problem that I could procure on subscription across the Web."

Given the increasing demand for cloud-based solutions, combined with a general reluctance to pay hefty upfront capital costs, Harte's comments would seem to reflect growing dissatisfaction with the traditional licensed software model -- and its “never-ending spiral” of ongoing expenses.

Are you as fed up as Harte?