Questions
- Evaluate CA Technologies’ move to cloud service provision.
- What would you change about that move?
- What are the risks in making your suggested changes?
- What IT capabilities, if any, are required to make this move reasonable?
9-611-047
JUNE 27, 2011

MARCO IANSITI
KERRY HERMAN

CA Technologies: Bringing the Cloud to Earth

I've watched every evolution and revolution in this industry, and we're at another inflection point. Cloud computing is going to happen for sure. Write it down. Put my name on it if you want to.
— William McCracken, CEO1

It was barely 7:00 a.m. on a cold mid-March 2011 morning. Adam Famularo, general manager, Cloud Computing business, CA Technologies, had just spent 30 minutes with David Dobson, EVP and group executive, Customer Solutions Group, at the company's Manhattan offices. CA Technologies, a $4.4 billion company headquartered in Islandia, New York, sells information technology (IT) systems management and security software solutions, primarily to Fortune 2000 companies and governments, supporting these organizations' IT environments. Over the past year, CA Technologies had launched a new strategy and undertaken a restructuring to reorganize the firm, and had made a number of strategic acquisitions—including software firms 3Tera, NetQoS, Nimsoft, and Arcot Systems—which strengthened its cloud computing offerings.

Famularo had been working on a strategy presentation Dobson would deliver to an audience of a few prominent CEOs, chief information officers (CIOs), and chief technology officers (CTOs) in coming weeks. The presentation focused on communicating and positioning CA Technologies' new strategy for cloud computing. Cloud computing, described by some as evolutionary, by some as revolutionary, and by some even as "disruptive," enabled the delivery of virtually limitless IT capacity as a service over the Internet.

Famularo and Dobson had reviewed several analyst reports as they prepared the presentation. Analysts valued CA Technologies at about $12.45 billion in early January 2011;2 however, the reports were a bit of a mixed bag overall. On the one hand, many analysts had been bullish about CA Technologies in the summer and fall of 2010, citing the firm's traditional mainframe business, its suite of virtualization and security products, the increasing move of enterprises to the cloud, as well as a new mainframe platform release from IBM that would mean renewals and upgrades for CA Technologies clients. Additionally, analysts had called out the several 2010 acquisitions—particularly 3Tera and Nimsoft—as strategic additions to the company's portfolio. On the other hand, some analysts had voiced caution, highlighting CA Technologies' need to clearly state its strategy and objectives and identify a compelling way to position and differentiate its cloud offerings. The industry was rife with mergers and large companies gobbling up smaller startups, often as a defensive move, rather than for any strategic or product alignment reasons. Without a clear strategy, these kinds of acquisitions were likely to fail.

_______________

Professor Marco Iansiti and Assistant Director Kerry Herman, Global Research Group, prepared this case. HBS cases are developed solely as the basis for class discussion. Cases are not intended to serve as endorsements, sources of primary data, or illustrations of effective or ineffective management.

Copyright © 2011 President and Fellows of Harvard College. To order copies or request permission to reproduce materials, call 1-800-545-7685, write Harvard Business School Publishing, Boston, MA 02163, or go to www.hbsp.harvard.edu/educators. This publication may not be digitized, photocopied, or otherwise reproduced, posted, or transmitted, without the permission of Harvard Business School.
For the exclusive use of M. Awadallah, 2023.
This document is authorized for use only by Maiami Awadallah in ISM4151-Spring 2023 taught by POUYAN ESMAEIL ZADEH, Florida International University from Dec 2022 to May 2023.
611-047 CA Technologies: Bringing the Cloud to Earth
Famularo thought back over the criticism CA Technologies had weathered; industry pundits claiming that the mainframe—CA Technologies’ bread and butter—was dead and would go away. In fact, he had the opposite “problem.” In 2010, analysts reported that CA Technologies’ mainframe business was strong and accounted for over 60% of the firm’s revenues and a majority of its profits, with no sign of slowing down.3 Moreover, Famularo knew CA Technologies’ line of products was anything but outdated, having evolved to include solutions to help enterprises manage and secure their increasingly complex IT environments, incorporating mainframe and distributed platforms, web services and an expanding array of mobile devices. But given the attractiveness of CA Technologies’ current mainframe and distributed business lines, how firmly would the company back the emergent cloud business? How would the company handle technological evolution and its associated need for transformation? What would CA Technologies need to do to drive the growth of its cloud business? What strategies, systems and processes would have to be put in place to drive scale and profitability? Most importantly, should CA Technologies’ cloud business be positioned as an incremental, complementary addition to its mainframe and client-server businesses or should it be framed as a disruptive and entirely revolutionary technology?
Company Background
In 1976, Charles B. Wang and Russ Artzt, fellow Queens College mathematics graduates, and a handful of colleagues founded New York-based Computer Associates. “We both had an entrepreneurial spirit and we wanted to innovate and build something,” recalled Artzt, CA Technologies co-founder and vice chairman. Information technology management at that time presented significant problems, with “lots of gaps,” said Artzt, such as the lack of secure and protected master data files. At that time, data was stored on disks or magnetic tape, and, according to Artzt, “anyone could potentially change or delete the data on the file. There were no controls.” CA Technologies started with a focus on IBM’s mainframe market, introducing CA SORT, which delivered full-function sort, merge and copy capabilities, and helped the young company hit $5 million within a year. Wang fostered a competitive culture internally, pushing long working hours and offering free breakfast to “get employees in the building and working quickly.”4
In 1981, the company went public, selling 500,000 shares to raise $3.2 million.5 CA Technologies focused on developing mainframe software that streamlined and automated mainframe operations, lowering cost and freeing IT staff to work on more strategic projects. Fueled by early success, CA Technologies pursued an aggressive growth strategy based on internal development and acquisitions. By the late 1980s, CA Technologies’ mainframe software portfolio had broadened to include workload scheduling software, security management products, helpdesk, event automation software, mainframe performance optimization, tape and media management, application development and databases. CA Technologies grew to hold the number one or two market share position in many of these market segments. When the company acquired Uccel Corporation in 1987, it became the largest independent vendor of mainframe infrastructure software, and in 1989, it was the first software company to exceed $1 billion in revenues. Into the 1990s the company continued to provide new products aimed at security, relational database management, and systems management. CA Technologies was the first to develop a software architecture providing a unified development environment for all multi-platform enterprise solutions. With its Unicenter solution, the company became the first to deliver cross-platform solutions for UNIX and other open systems, and pioneered a product to integrate network and systems management.
CA Technologies continued to expand beyond the mainframe into client-server, or distributed, computing with a series of acquisitions, including Platinum Technology in 1999. By the mid-1990s, the company had expanded to Asia, Africa and Latin America, offered a full suite of products for Microsoft Windows NT, and formed partnerships with both Microsoft and Netscape at a time when the two firms were in a head-to-head battle for dominance in the browser wars.
Customers of CA Technologies typically purchased a license to use the specified software for a period of three to five years, including software updates and technical support. From early on, CA Technologies adapted to its customers’ changing business needs, pioneering flexible licensing (FlexSelect Licensing), which offered customers a subscription-like sales model enabling them to pace their technology investments to their business growth.6 By 2002, with a growing portfolio of customers and product offerings, CA Technologies created manager-led business units to better align its software solutions with customer needs.7 CA Technologies maintained a highly effective executive sales force, with deep connections into the CIOs’ offices and IT organizations of the Global 2000.
Wang transitioned out as CEO in 2000, and in April 2001, a New York Times article appeared suggesting CA Technologies’ management had inaccurately reported earnings through various accounting practices, recognizing revenues from software contracts in quarters before the contracts were signed.8 The company’s board directed the firm’s general counsel to investigate the allegations. In 2003, federal investigators remained concerned, and in October, the audit committee announced that some contracts had been backdated by CA Technologies employees. In February 2005, an independent directors committee was formed, which included William McCracken, a newly appointed board member and long-time senior IBM executive. As a member of a special litigation commission, McCracken recalled, “We dug deep into the company to try to understand the extent of the accounting issues. It gave me an up-close and personal view of the company.”9
In 2005, with a new CEO, industry veteran John Swainson, at the helm, CA Technologies unveiled its new logo, launched a new global branding program, “Believe Again,” and released 26 new versions and 85 new products—its largest release to date—all under the umbrella of enterprise IT management (EITM), the firm’s push to unify and simplify the management of enterprise-wide IT. Several strategic acquisitions bolstered CA Technologies’ capabilities, including Concord, Wily, Netegrity, and Niku.
In 2006, CA Technologies officially changed its name to CA, Inc. and began using the tag line “Transforming IT Management,” to underscore the firm’s primary differentiator: the ability to unify elements of IT and simplify complex IT management. Over the next four years, the company continued to build upon its EITM portfolio, garnering recognition from top IT industry analyst firms as a leader or leading provider in multiple IT management market segments, including project and portfolio management, web access management, identity and access management, user provisioning, records management, job scheduling and data discovery solutions, to name a few.
Evolution of an Industry: IT Management
Every 15 years or so a set of technology advances enables a new model at the same time that businesses realize the current approach is running out of gas. The PC wasn’t just a little mainframe. It changed the way people interacted with computers. The Web wasn’t just a different kind of PC. It radically changed the way people interact with businesses. The cloud is the same. Cloud is not just a new way to run data centers. It is a completely new approach to IT.
— Donald Ferguson, executive vice president and Chief Technology Officer
The history of mainframes was tied to the evolution of IT and had its roots in the 1960s, when computing and computers became common in mainstream business functions. (Exhibit 1a provides a diagram of the waves of business and government investment in IT as a percentage of GDP from the 1950s to 2016 (forecast); Exhibit 1b provides a simple diagram of the stages.) Early mainframe computers ran software applications to help manage and automate a company’s many processes, from payroll and accounting tasks to scheduling production processes; these tasks soon grew to encompass a wider variety of specialized business applications, including enterprise resource planning and warehouse management, vertical applications for industries such as medical, telecommunications, insurance and finance, and storage of mission-critical consumer and employee data.
In 1964, IBM’s System/360 set a new mainframe standard,10 in part because it was designed to be upward compatible, enabling future generations of mainframes to run older applications unchanged. Businesses adopted mainframe computers, confident that investments in expensive custom software development would not be rendered obsolete, even as computing architectures advanced. Indeed, even IBM’s latest (2010) mainframes could generally execute programs developed in 1964 for the IBM System/360. The IBM System/360—and later the IBM System/370 and subsequent architectures— have been the most successful and long-lived in history, driving a massive, almost 50-year build-up of applications and mainframe software.
Beginning in the late-1970s, the business computing environment saw the introduction of word processors (Wang 1200) and personal computers (the IBM PC, introduced 1981), allowing employees greater flexibility and autonomy in running software. Spreadsheet, word processing, presentation and other desktop publishing capabilities soon followed, and throughout the 1980s, personal computers spread through most businesses. The mainframe continued to underpin a company’s IT systems, however, connecting and supporting all of these machines and their functions across the new platforms, while it continued to run core functions.
As use of computing in businesses grew, complexity increased, and organizations began to spend more of their IT budgets for ongoing operation and maintenance. By the early 1980s, one-third of a mainframe customer’s costs went to hardware; programming and services made up about two-thirds of their costs.11 Unlike other office machines, such as typewriters or cash registers, computers altered a company’s information systems, keeping a dedicated team of engineers and programmers busy creating new applications as well as managing and troubleshooting old ones. Complexity and cost were the result, creating motivation for IT management companies, such as CA Technologies.
Prior to the emergence of software vendors such as CA Technologies, mainframe operating systems and basic utility software were supplied by the hardware company (e.g., IBM), with specific business applications custom-developed by each organization to meet its own unique requirements. Companies such as CA Technologies recognized that the increasing complexity and cost associated with running mainframe computers created a need for software that managed and secured the mainframe, improving its ability to process work while reducing the total cost of ownership.
CA Technologies and other software management companies developed software that streamlined and automated mainframe operations, lowering cost and freeing IT staff to work on more strategic projects. Armed with sophisticated mainframe management software, businesses felt comfortable that they could handle additional complexity as their IT systems grew. Internal IT groups leveraged new mainframe technologies, permitting large-scale IT consolidation and creating immense datacenters based on larger and larger servers.
With the advent of lower-cost systems in the 1980s, and multi-module application software, which enabled enterprise management software to run across (cheaper) smaller machines connected over networks, more and more businesses could afford these systems and applications to run their computing needs. Yet, despite the claim for its “imminent demise” as these newer and less expensive resources became available, the mainframe continued to serve as the base support for the enterprise’s core IT needs, networks and computing functions.
In time, IT management was characterized by network computing (1992-2008), which drove additional process automation with the introduction of ERP software, customer relationship management software and supply chain management software, all based on client/server or distributed computing architecture. Distributed servers increasingly handled an enterprise’s departmental and other workloads, emulating such mainframe functions as partitioning capabilities, virtualization technologies and workload management controls.12 The rise and maturation of distributed computing allowed core IT functions to expand beyond the mainframe, requiring management capabilities for these environments similar to those that had arisen earlier for the mainframe. The enterprise’s IT capabilities became yet more complex as the Internet and e-commerce software drove enterprise applications into sales and purchase processes over the Web. At the same time, the mainframe continued to mature and support a broader range of software vendors and offerings.13
By 2000, software as a service (SaaS), application service providers (ASPs), and hosting services took hold, offering applications or the ability to run companies’ software in remote data centers, for a monthly or yearly fee, obviating the need to invest and maintain infrastructure. Increasingly, IT customers became more comfortable with the knowledge that “their” applications ran remotely.14 The parallel development of browser-based access to many applications increased employee mobility (and complexity for IT groups), and fostered even more proliferation of technology devices connected to the network.
By 2008, despite the continued claim of its imminent demise, mainframe sales continued to grow; one analyst estimated that mainframes were home to “70% of the world’s critical transactional data” that year.15 Virtualization, an idea whose roots traced to the mainframe, was becoming increasingly important in the distributed computing environment as well, in three forms (system, platform and hardware). The market for mainframe software held an 8.5% share of the overall 2009 software market, estimated at $272 billion.16 (Exhibits 2 to 4 provide data on worldwide software revenues.)
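Taken literally, the share and market-size figures above imply a mainframe software market of roughly $23 billion in 2009. A minimal sketch of that arithmetic (the 8.5% share and the $272 billion overall market are the case's figures; the product is an inference, not a number the case states):

```python
# Case figures for 2009.
overall_software_market = 272e9  # overall software market, USD
mainframe_share = 0.085          # mainframe software's share (8.5%)

# Inferred size of the mainframe software segment.
mainframe_software_market = overall_software_market * mainframe_share
print(f"Implied mainframe software market: ${mainframe_software_market / 1e9:.1f}B")
```

This works out to about $23.1 billion, sizable in absolute terms despite the segment's single-digit share.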
The Mainframe Context in 2010
By 2010, companies’ IT profiles had changed dramatically since the early 1960s. Yet, contrary to common perception, the mainframe—often now in the guise of a server—still served as the core resource and system base for some 4,000 to 5,000 major organizations around the globe, and in 2010 was a $49 billion industry.17 Concomitant spending on power, coding, and management brought that number close to $200 billion (see Exhibit 5). Enterprise IT expenditure, as a share of capital expenditure, had gone from 2% in 1965 to 25% in 2011.18 The mainframe supported a complex ecosystem of surrounding software, ranging from system software to run the machines, to application software to execute tasks, to security and compliance software.19
Changes in hardware and software had introduced layers of additional complexity sitting on top of the original technologies that underpinned a company’s functions. As Ajei Gopal, executive vice president, Technology and Development Group, noted:
In 1995 we saw Netscape’s IPO, HTML, browsers, in the context of dial-up access and some broadband penetration; business needs opened up access to customers 24/7. In 2010 we are witnessing the cloud: SaaS, social networking ranging from sharing of pictures etc., Web 2.0, distributed computing. These things add on to each previous breakthrough. Nothing goes away. It takes a generational shift to change.
As the IT environment had evolved, total cost of ownership (TCO) issues played a role in the enterprise’s hardware decisions. Servers in a distributed computing environment, for example, were typically cheap and straightforward, and could offer additional capacity, flexibility, reliability, and distribution of an enterprise’s data, which mitigated risk. However, adding hardware often meant hiring more IT professionals—the most expensive side of any IT equation—to manage and maintain the environment. Alternatively, mainframes could represent a significant upfront expense; however, once a mainframe was in place, an enterprise could often add capacity without impacting the resources needed to manage the environment. As one observer noted, “In an existing mainframe environment, many of the initial costs associated with deploying a new solution already have been paid.”20
The role of the CIO had changed over time as well, becoming more driven by business needs, with CIOs increasingly forced to function as service-level managers. The demographics of IT engineers had also changed. By the mid-2000s, most graduating computer engineers had never used a mainframe and received no instruction in mainframe technology; many had to learn mainframe skills on the job.21 As one observer wrote, “Teaching mainframe skills is out of vogue at many universities with the advent of newer approaches to solving the biggest computing challenges.”22 In 2010, the average age of mainframe workers was 55 to 60 years old. Many feared the lack of talent would prompt companies to adopt smaller, distributed servers to run networks, Web operations and the range of other computing tasks run by mainframes.23
By 2010, the number of mainframes had dropped to about 17,000 from 30,000-to-40,000 in earlier decades.24 Two factors were cited: the emergence of competing technologies (servers based on Intel semiconductors) and newer, more adept and efficient mainframes that could handle work that had required several machines in the past.25 According to one source, IBM’s 2010 mainframe ran up to 60% faster than models introduced in 2008, using the same amount of power and cutting service costs by as much as 70%.26 Compared to models from earlier decades, mainframes in 2010 generally offered thousands of times more capacity. As one CA Technologies executive noted, “So the reduction from 30,000-to-40,000 to 17,000 mainframes actually represented a significant increase in net capacity that had been growing at 20% a year since 2001.” (Refer to Exhibit 5.)
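The executive's point can be checked with back-of-envelope arithmetic. A minimal sketch using the case's figures, with one assumption: the earlier-decades installed base of 30,000-to-40,000 machines is taken at its 35,000 midpoint, which the case does not state explicitly.

```python
# Case figures: machine counts and 20%/yr growth in total installed capacity.
machines_2001 = 35_000          # assumed midpoint of the cited 30,000-40,000 range
machines_2010 = 17_000
annual_capacity_growth = 0.20   # total installed capacity growth per year since 2001
years = 2010 - 2001

# Total capacity relative to 2001, and the per-machine multiple it implies.
total_capacity_multiple = (1 + annual_capacity_growth) ** years
per_machine_multiple = total_capacity_multiple * machines_2001 / machines_2010

print(f"Total capacity vs. 2001: {total_capacity_multiple:.1f}x")      # about 5.2x
print(f"Implied capacity per machine: {per_machine_multiple:.1f}x")    # about 10.6x
```

In other words, total capacity roughly quintupled even as the installed base halved, implying each 2010 mainframe carried on the order of ten times the workload of its 2001 counterpart.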
The Next Step for IT: Cloud Computing
Toward the end of the first decade of the 21st century, cloud computing dominated IT discussions. Cloud computing was based on nearly all of the concepts that were part of a mainframe environment, but was centered on a dynamic pool of resources, in many cases outside an organization’s own IT environment. At its base, the cloud computing model assumed a shared utility approach to applications, typically stored or accessed remotely via a public or private network, featuring a move for enterprises away from owning and operating their own software and hardware (Exhibit 6 provides illustrations). Hotmail, Yahoo! Mail, Google Mail, as well as Flickr and other similar social sharing sites, were examples of popular (commercial) cloud services. As one observer noted:
The big-picture concept underpinning cloud computing is that the economic efficiencies associated with mega scale service providers will be compelling. [. . .] If you accept the fundamental premise of such efficiencies, the future of computing lies in multitenant shared
facilities of massive size (where size does not necessarily mean a single facility, but a shared resource pool that is most likely geographically distributed).27
Analysts pegged potential savings for enterprises using the cloud at about 40%; one analyst projected cloud computing would save Europe’s five largest economies as much as £645 billion (about $920 billion) from 2010-2015.28
As early as 2008, cloud computing reflected a convergence of many factors in the IT environment, including pervasive virtualization, fast application and service provisioning, elastic response to load changes, low-touch management, network-centric access, and the ability to move workloads from one location to another.29 (Exhibits 7 and 8 outline some of the IT supply chain and management aspects addressed by cloud computing.) Industry insiders were divided about the opportunities, and even conflicted about simple definitions of cloud computing. And while most agreed cloud computing was “bringing enormous power to large and small organizations,”30 many decried the hype around cloud computing; a Forrester analyst called cloud computing “overblown,”31 and Oracle CEO Larry Ellison criticized the computer industry saying, “[W]e’ve redefined cloud computing to include everything we already do. I can’t think of anything that isn’t cloud computing with all of these announcements. [. . .] Maybe I’m an idiot, but I have no idea what anyone is talking about. What is it? It’s complete gibberish. It’s insane. When is this idiocy going to stop?”32
Cloud computing had its skeptics for good reason. Some argued that applications and services would continue to run both inside enterprise firewalls and in the cloud, for reasons of technology, switching costs and control. As an observer noted, “Many applications were written with a tightly coupled system architecture in mind [. . .] and can’t simply be moved to a more loosely coupled cloud environment.”33 For legacy applications, there were significant switching costs and time required to move to a new software model. Security and compliance were also concerns.