Archive for May, 2010
Saturday, May 29th, 2010
By: Tim Bajarin
I had the privilege of speaking at last week’s Netbook Summit in San Francisco. The event was created well before the introduction of the iPad, when netbooks were still all the rage. Since Apple introduced its tablet, consumers have begun questioning the value of netbooks relative to tablets. This was one of the show’s major topics.
I had a session on Tuesday, in which I discussed how my analyst group, Creative Strategies, predicts the iPad will affect the market. During my presentation, I noted that, while netbooks aren’t going to go away, there will be a sort of bifurcation around what we call a “content consumption versus content creation” focus, driving both product categories in the future. Laptops and netbooks are optimized for content creation. But on any given day, content creation takes up only 25 percent of the time most people spend on computers. The other 75 percent is spent consuming content.
People will always need laptops or desktops to do the heavy lifting: creating reports and long e-mails, managing their music and video libraries, and editing their photos and videos. If content creation is important to a user, a laptop might prove a better choice than a netbook. People appreciate the portability of a netbook, but the devices’ small screens and keyboards are not ideal for content creation. Some people make netbooks work for them, but as someone who has used these devices from day one, I find them too small for serious content creation.
For the 75 percent of the time that consumption is the focus, however, tablets can be optimal. They are highly portable and offer an easier, more flexible way to surf the Internet, play games, read books, use mobile apps, and consume movies, video, and music. Tablets offer an easy-to-use touch interface and, thanks to the rich computing environment delivered by iPad and Android-based devices, they could become the one computing tool that people of all walks of life adopt and integrate into their mobile digital lifestyles.
All of the data presented at the Summit’s Market Research panel showed that demand for netbooks will peak this year, declining after that. And when Retrevo conducted a poll of over 1,000 individuals, it found that 78 percent were leaning toward buying an iPad, instead of a netbook. However, I think that the decline in netbook demand will have more to do with how PC manufacturers reposition their thin and light laptops. In fact, most market researchers don’t even put netbooks into a separate category—rather, they’re just counted as part of the overall laptop forecast.
Indeed, we just see netbooks as small laptops. And while demand for netbooks still exists, we believe there will be a big shift toward thin-and-lights with 11- and 13-inch screens and full keyboards. These devices will replace netbooks. The big issue, however, will be price. Netbooks can sell for as low as $299, while an 11-inch thin-and-light starts at $449. Take the Toshiba Satellite T115. It looks like—and is as heavy as—a netbook, but it has an 11.6-inch screen and a full keyboard and costs $449. Consumers looking to purchase a small, lightweight notebook will end up weighing cost against functionality more closely.
But the emerging tablet market is compelling for those would-be netbook buyers looking to consume content on a lightweight, truly mobile platform. This has more to do with the digital lifestyle than anything else. People want to access information anywhere at any time, and they want it in an easy-to-read format. This is why the iPad has struck such a solid chord with consumers. Creative Strategies’ research has shown that a lot of iPad early adopters take the device with them throughout the house, using it while on the couch, lying in bed, and lounging in the backyard, as they play games, watch movies, and download video at will.
There is a device coming out later this year that has the best of both worlds baked into a single product. The Lenovo U1, announced at this year’s CES, looks like a normal netbook, but the screen pops out to become a full-fledged tablet. In netbook mode, the device runs Windows XP. In tablet mode, it runs the Thunder Linux OS, which is optimized for Web browsing and mobile apps. It’s a great transition product for those who need a netbook but want the flexibility of a tablet. It could be hot, if priced right.
If you’re trying to decide between a netbook or a tablet, keep this in mind: if your digital lifestyle requires serious content creation or management, a laptop or notebook makes a lot of sense. However, if content creation is a low priority, an iPad or other tablet could really enhance the way you consume information.
Oh, and one more thing: my session at the Netbook Summit focused on whether the iPad will disrupt the overall PC market. My comments largely focused on the fact that the iPad is most interesting when it is used as a highly mobile content consumption device. In that sense, the device does represent a new paradigm in personal computing that really could prove disruptive to the way we think about computers.
I made a point, however, to reiterate that the device is not for everyone. A lot of people who use laptops will find it a nice supplementary product, one not at the center of their digital lifestyle. I concluded my speech by adding that the real market for tablets may be among non-PC users around the world who aren’t computer literate but want transparent access to the Internet. The iPad and similar tablets could ultimately be the devices to bring the next billion people into the world of technology.
Saturday, May 29th, 2010
Web search engines make our lives easier: They connect us with what we’re searching for in a matter of seconds, and sometimes they bring us to places we didn’t even know we were looking for.
But they can also teach us a lot about ourselves, as more than half of adult internet users already know.
About 57 percent of adult internet users in the United States said they have entered their name into a search engine to assess their digital reputation, according to a new Pew Research Center study “Reputation Management and Social Media.”
That’s a significant increase since 2006, when only 47 percent of adult internet users said they had looked their name up on a search engine. The findings show “reputation management has now become a defining feature of online life,” the study says.
This probably doesn’t come as a surprise to many, considering that a new story about Facebook’s privacy settings surfaces each day.
And the concern about people’s digital reputations will most likely continue to grow as posting and sharing information over the internet becomes more and more widespread.
The study also found that young adults are more apt to “restrict what they share” and manage their online reputations more closely than older internet users. This is “contrary to the popular perception that younger users embrace a laissez-faire attitude about their online reputations,” wrote Mary Madden, a senior research specialist.
The Pew Research Center study, conducted by phone between August 18 and September 14, 2009, sampled 2,253 adults 18 and older. The margin of error is 2.3 percentage points.
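For reference, the quoted margin of error is close to what simple random sampling predicts at a 95 percent confidence level. The sketch below computes that textbook baseline; Pew’s published figure is slightly larger because it also accounts for weighting and design effects, which this formula omits.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a proportion under simple random sampling.

    p=0.5 is the worst case (maximizes the margin); z=1.96 is the
    critical value for 95% confidence.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Pew's sample of 2,253 adults
moe = margin_of_error(2253)
print(f"+/- {moe * 100:.1f} percentage points")  # ~2.1 under simple random sampling
```

Larger samples shrink the margin roughly with the square root of n, which is why quadrupling a sample only halves the error.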
Have you ever Googled yourself? Were you surprised by what you found?
Friday, May 28th, 2010
When Sony issued a recent PlayStation 3 update removing the device’s ability to install alternate operating systems like Linux, it did so to protect copyrighted content—but several research projects suffered collateral damage.
The Air Force is one example. The Air Force Research Laboratory in Rome, New York picked up 336 PS3 systems in 2009 and built itself a 53 teraFLOP processing cluster. Once completed as a proof of concept, Air Force researchers then scaled up by a factor of six and went in search of 2,200 more consoles (later scaled back to 1,700). The $663,000 contract was awarded on January 6, 2010, to a small company called Fixstars that could provide 1,700 160GB PS3 systems to the government.
Getting that many units was difficult enough that the government required bidders to get a letter from Sony certifying that the units were actually available.
Dirt cheap computing
Another grotesque waste of taxpayer dollars? Exactly the opposite, according to research lab staff. Off-the-shelf PS3s let the lab take advantage of Sony’s hardware subsidy, getting powerful Cell processors more cheaply than any other solution could.
“The Advanced Computing Architectures team at the Information Directorate considered several alternatives to arrive at the configuration of the proposed system, including the Sony BCU-100, IBM Blade Q22, and IBM PowerXCell 8i CAB accelerators cards,” said the Air Force last year. “In particular, the performance capabilities of the Cell Broadband engine were examined in considerable detail on each of the algorithms.”
The team also looked into using dual-quad-core Xeon servers for its cluster, going so far as to do a “detailed study of Xeon multithreading and SSE4 optimization on image processing intensive tasks.” The hardware worked well, and it eventually came to serve as subcluster headnodes that sit between the PS3 cluster itself and the control terminals.
But building the entire cluster out of Xeons would cost “more than an order of magnitude greater than the PS3 technology.” The team also looked into advanced GPGPUs but found that they worked best to “accelerate a subset of our algorithms, particularly the frontend processing and backend visualization, but lag the PS3 in the bulk of the calculations where processes need to intercommunicate and share memory beyond what is supported efficiently by the GPGPUs.”
The result was the 500 TeraFLOPS Heterogeneous Cluster powered by PS3s but connected to subcluster heads of dual-quad Xeons with multiple GPGPUs.
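The story’s numbers can be sanity-checked with back-of-the-envelope arithmetic. All figures below come from the article itself; the split between PS3 throughput and head-node throughput is an inference, not an official breakdown.

```python
# Proof-of-concept cluster: 336 PS3s delivering 53 TFLOPS
per_ps3_tflops = 53 / 336  # ~0.158 TFLOPS per console

# Scaled-up cluster: the original 336 plus the 1,700 consoles
# from the January 2010 contract
total_consoles = 336 + 1700
ps3_tflops = total_consoles * per_ps3_tflops  # ~321 TFLOPS from PS3s alone

# The rest of the 500 TFLOPS target would come from the dual-quad Xeon
# subcluster head nodes and their GPGPUs
headnode_tflops = 500 - ps3_tflops

print(f"{per_ps3_tflops:.3f} TFLOPS/console; PS3s ~{ps3_tflops:.0f} TFLOPS; "
      f"head nodes ~{headnode_tflops:.0f} TFLOPS")
```

The per-console figure of roughly 0.16 TFLOPS is consistent with the single-precision peak of the Cell processor, which lends credence to the lab’s published cluster totals.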
The Air Force team ordered the hardware, spent days unboxing it and imaging each unit to run Linux, and then… Sony removed the Linux install option a couple months later. (One can only imagine what happened to those 2,000 PS3 controllers and other unneeded accessories.)
Does it matter?
Sony’s decision had no immediate impact on the cluster; for obvious reasons, the PS3s are not hooked into the PlayStation Network and don’t need Sony’s firmware updates. But what happens when a PS3 dies or needs repair? Tough luck.
We checked in with the Air Force Research Laboratory, which noted its disappointment with the Sony decision. “We will have to continue to use the systems we already have in hand,” the lab told Ars, but “this will make it difficult to replace systems that break or fail. The refurbished PS3s also have the problem that when they come back from Sony, they have the firmware (gameOS) and it will not allow Other OS, which seems wrong. We are aware of class-action lawsuits against Sony for taking away this option on systems that used to have it.”
A similar issue will confront academic PS3 clusters, which have sprung up in labs across the country. In 2007, a North Carolina State professor cobbled together a small cluster of his own after he “spent a few hours one day in early January driving from store to store to purchase the eight machines.”
The University of Massachusetts has 16 machines networked into a cluster called the “Gravity Grid,” used to look at gravitational waves and black holes. According to the physicists at UMass, the PS3’s “incredibly low cost make[s] it very attractive as a scientific computing node, i.e., part of a compute cluster. In fact, it’s highly plausible that the raw computing power-per-dollar that the PS3 offers is significantly higher than anything else on the market today.”
All such projects will last as long as the machines survive or used machines are still available, but new hardware can’t be added and refurbished machines can’t be used. A class-action lawsuit has recently targeted Sony for removing a promised feature retroactively, though the issue is unlikely to be decided anytime soon.
We asked Sony for comment on how its decision would affect scientific computing clusters, but received no answer before publication.
A love affair with off-the-shelf consumer hardware
Such are the dangers of relying on consumer-grade hardware sold with a very different set of concerns from those that bedevil the scientists, especially in an era where firmware updates routinely alter functionality. But the Air Force, for one, has no plans to stop.
“The gaming and graphics market continues to push the state of the art and lowers the cost of High Performance Computing, FLOPS/WATTS per dollar,” the Air Force Research Laboratory told Ars. “This is important for embedded HPC, our area of expertise.
“The HPC environment is rapidly changing; leveraging technology that is subsidized by large consumer markets will always have large cost advantages. This gives us the experience (lesson learned) to develop HPC with low-cost hardware, benefitting the tax payer, Air Force, Air Force Research Lab while utilizing limited DoD budgets.”
Thursday, May 27th, 2010
There aren’t many details yet, but according to the guys over at BGR, some AT&T employees have confirmed that the iPhone 4G will be launched in June. That seems pretty fast since it’s scheduled to be unveiled in early June.
But maybe Apple wants to just get it over with at this point, since the cat was let out of the bag a while ago. If this is true, you won’t have to wait long at all.
We’ll find out soon enough.
Wednesday, May 26th, 2010
Stories about a website hawking steering wheel mounts for iPads that would enable drivers to do their reading en route started popping up all over the interwebs last week. The idea, of course, is a safety proponent’s nightmare, and as one might imagine, it’s most likely a hoax. Jalopnik did some rudimentary research and was able to debunk the product as an almost certain prank.
Unfortunately, no one let USA Today in on the joke. They posted an item this morning called “Is Mounting Your Apple iPad On a Steering Wheel Safe?” on their automotive blog, Drive On, which ended with the query, “Tell us what you think. Is this invention safe?”
“First of all, it’s a hoax,” one of Drive On’s readers commented. “Second of all, why is a newspaper publishing a hoax as if it were real?”
NPR has a theory. As USA Today pointed out in their coverage, the iPad in the video displays the magazine’s app:
A colleague sort of happened across it as he was surfing the Internet. How could he help but notice that the makers of the “iPad Steering Wheel Mount” had chosen USA TODAY’s app to demonstrate this device aimed at allowing you to read while you drive.
Writes Bill Chappell on the NPR blog All Tech Considered, “This steering wheel mount may be a hoax — but it’s a capitalized one — note the Google Ads. And I had to think for a sec, ‘Hey, I wonder if this guy got a product-placement fee from USA Today?’ ”
USA Today disavows any complicity in the matter, stating in today’s article, “Thankfully, we can’t get accused of endorsing this contraption because it’s apparently not available for sale right now. If you press the ‘buy’ button, you find they are sold out.”
(Hm, maybe a good sign these things were never for sale in the first place?)
At any rate, it’s fairly entertaining to watch the iPad display flip around in accordance with the whims of the driver.
Wednesday, May 26th, 2010
I try not to write too many of these open letters because, well, they’re a gimmicky way to hook readers on a Monday after a long week of news. But your relative silence since last Friday’s revelation that you collected personal data from unsecured Wi-Fi hot spots all over the globe shows you are underestimating the slow burn this incident has sparked among your user base, otherwise known as basically everybody on the Internet.
This isn’t like Facebook exposing the pictures from your 5-year college reunion, the one where you learned that no, you can no longer funnel beers quite so easily. This is every modern privacy advocate’s worst nightmare and every Google critic’s fantasy: the most information-hungry company the world has ever known has gotten caught going a little too far.
Sure, you claim the data collected as part of the Street View project was random and not necessarily identifiable. And yes, you were the one to notify the world what you had done, blaming it on an inadvertent oversight. Still, your blog post on the matter raises more questions than it answers.
For example, why did a Google engineer ever write code that was designed to, in your words, “(sample) all categories of publicly broadcast WiFi data”? For what possible reason could such comprehensive code be used other than to collect payload data from unsecured wireless access points?
You said you never used any of this data to help build or refine Google products. How do you know that? If this data was kept completely and totally separate from benign data gathered as part of the Street View project, how did you not realize that you were gathering this type of data years ago? It’s hard to believe that any form of data–the lifeblood of Google–could get tossed in the digital equivalent of a garage closet for years and forgotten.
It’s not enough to admit in the precise words of your co-founder that “we screwed up.” Pushing the boundaries and then apologizing after the fact is a business strategy that can only work for so long; you can’t fool all the people all the time.
Google collects more data on personal activities than just about anyone outside of the credit card industry, and most of the time that data improves your products and services. Yet your data-hungry culture can at times appear out of step with the mainstream world, and your tendency to brush off concerns about what your company might do with that data and how it protects that data troubles many who would otherwise see your company in the brightest of lights.
In 2003, the New York Times faced up to one of the worst crises in its history–the Jayson Blair fraud scandal–by publishing a thorough account of what had happened, how internal conditions at the paper allowed it to happen, and what would be done to prevent this from happening again. The painful exercise was cathartic for Times writers and readers, and went a long way toward restoring trust in one of America’s best news organizations.
You call yourself a company committed to openness and transparency? Prove it.
Publish a detailed account of why this Wi-Fi software was created, how it was allowed to permeate a high-profile Google project for several years, and what Google employees knew about the collection of this data. I know you love to remind critics of your data gathering that users have control over their data through features like Google Dashboard, but Google Dashboard only gives the user control over the data that Google tells that user they’re gathering.
It would be a grave mistake to let this matter go much further. Already governments skeptical of your power are licking their chops over this issue, and the lawsuits are also mounting.
You may be tempted to let the whole thing blow over and wait for Facebook to screw up some other privacy-related matter this week, diverting the nanosecond attention spans of the tech media and its readers. Don’t.
Earn back the trust you have so often stated is the contract between the users of your free services and your engineers. Explain clearly what was collected, how it will be deleted, and how this will never happen again.
Collecting data that users of your services submit willingly to the Internet is one thing. Driving the streets of the world and absorbing packets of data that come your way–no matter how inadvertent it may have been–is quite another.
Tuesday, May 25th, 2010
Facebook has fixed a flaw that let hackers delete Facebook friends without permission.
The flaw was reported Wednesday by Steven Abbagnaro, a student at Marist College in Poughkeepsie, New York. It was patched Friday afternoon, Pacific time, after the IDG News Service notified Facebook of the issue.
The bug was a variation of an earlier vulnerability that Facebook learned about last week, which affected a range of features on the Web site. Hackers could have leveraged Abbagnaro’s bug to delete all of a victim’s contacts, one by one, but it does not appear that anyone ever exploited it in a malicious way.
For Abbagnaro’s attack to work, however, a user would have to have been tricked into clicking on a malicious Web link while still logged into Facebook.
Facebook has struggled this week to fix these bugs, which are called cross-site request forgery flaws. They exist because of relatively simple Web programming mistakes in the Web site’s code, and security researchers have criticized Facebook for not fixing them more quickly.
“We’re in the process of doing a full audit and are building additional protections for this type of potential attack across the code base,” said Simon Axten, a Facebook spokesman, in a Friday e-mail interview. “We began working on this one as soon as we learned about it and pushed a fix early this afternoon.”
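Cross-site request forgery works because the browser automatically attaches the victim’s session cookie to any request, including one triggered from a page the attacker controls. The standard defense is a secret per-session token that a foreign page cannot read or guess. The sketch below illustrates that general pattern; it is not Facebook’s actual implementation, and the session IDs and secret are purely illustrative.

```python
import hmac
import hashlib
import secrets

# Illustrative server-side secret; in practice this is stored securely
# and never sent to the client.
SERVER_SECRET = secrets.token_bytes(32)

def issue_csrf_token(session_id: str) -> str:
    """Derive a token bound to the session; the server embeds it
    in every form or state-changing link it serves."""
    return hmac.new(SERVER_SECRET, session_id.encode(), hashlib.sha256).hexdigest()

def is_valid_request(session_id: str, submitted_token: str) -> bool:
    """A state-changing request (e.g. 'remove friend') must echo the token.
    A forged cross-site request carries the session cookie automatically,
    but it cannot supply the matching token."""
    expected = issue_csrf_token(session_id)
    # Constant-time comparison avoids leaking the token via timing
    return hmac.compare_digest(expected, submitted_token)
```

Because the token is tied to the session and validated on every sensitive request, a malicious link clicked while logged in, as described in the attack above, would be rejected.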
Monday, May 24th, 2010
When the U.S. economy tanked in 2008, companies were quick to rein in information technology spending. Now, amid signs of recovery, they risk problems by ramping up IT budgets too quickly to compensate. Market researcher Gartner (IT) forecasts that global IT spending will swing from a 4.6 percent decline in 2009 to a 4.6 percent increase this year, to $3.4 trillion. To ensure that new money is put to work wisely, CEOs should advise their chief information officers to focus on five strategies for smarter spending.
• Take stock of what’s broken and devise a plan to fix it
Deep budget cuts during a recession trigger ripple effects that can last long after sales growth resumes. As business rebounds, the consequences of deferring maintenance of computer systems become apparent when they strain to handle more transactions.
A big consumer electronics company we worked with responded to the smaller downturn last decade by cutting its IT spending nearly in half. When business improved, the company discovered that more than a third of its most critical systems were operating on outdated technology no longer supported by its vendors. Before signing off on new spending, companies need to look for hidden vulnerabilities as a result of earlier cutbacks.
• Get full potential from new spending
A recovery unleashes pent-up demand for new corporate initiatives and the IT systems that support them. The usual rationale for boosting spending is that it’s a competitive necessity. Managers may have a hard time resisting that argument.
Yet for every 15 cents of each IT dollar spent on new computer systems, like state-of-the-art customer management software, companies spend 85 cents on less visible efforts to “keep the lights on.” These costs cover running ongoing operations, maintaining hardware, and patching software bugs. We’ve found that these follow-on expenses can run anywhere from two to 10 times the original outlay on a system over many years.
Companies need to recognize the full cost of their new spending over time, and weigh that against the benefits a project will generate. That way what initially looked like a smart investment won’t turn into a money pit.
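The two-to-ten-times follow-on multiplier above can be turned into a rough lifetime-cost model. The dollar figure below is hypothetical, chosen only to illustrate the article’s ratios.

```python
def total_cost_of_ownership(initial: float, followon_multiple: float) -> float:
    """Initial outlay plus the ongoing run/maintain/patch costs that,
    per the article, accumulate to 2x-10x the original spend."""
    return initial + initial * followon_multiple

initial = 1_000_000  # hypothetical new-system outlay
low = total_cost_of_ownership(initial, 2)    # optimistic case
high = total_cost_of_ownership(initial, 10)  # pessimistic case
print(f"Lifetime cost: ${low:,.0f} to ${high:,.0f}")
```

Framing a proposal this way, as a $3M-to-$11M lifetime commitment rather than a $1M purchase, is exactly the weighing of full cost against benefits the article recommends.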
• Banish the complexity
As companies retrofit old computer systems by installing software patches to improve performance, or fail to fully integrate systems after a merger or acquisition, complexity ensues. The result is IT systems that can be slow to respond and out of sync with the processes they’re supposed to support. To strip out complexity, organizations must first stop adding more computer systems that only make the problem worse. They should also consolidate those with subpar performance.
Unnecessary complexity can be rooted in the businesses IT supports. The surest way to eliminate it is for each business unit to calculate what its costs, including support, would be if it offered just one bare-bones product. Then calculate how those costs increase as features are added back in. Most companies find their costs jump sharply at the points where added complexity starts to overload their IT capacity.
Knowing where those points occur and how to avoid them can mean the difference between profitable growth and middling performance. This disciplined approach let one financial-services company we worked with eliminate more than 40 middleware programs that sat between its operating systems and business applications, greatly simplifying vendor relationships and software maintenance.
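The per-unit costing exercise described above can be sketched as a simple model. The cost figures here are invented purely to illustrate the “jump point” where added complexity starts to overload IT capacity.

```python
# Hypothetical incremental cost (in $k) of supporting each added product
# feature: costs climb gently until complexity overloads IT capacity,
# then jump sharply.
feature_costs = [10, 12, 15, 18, 60, 75, 90]

def cumulative_costs(base: float, increments: list) -> list:
    """Running cost total, starting from the bare-bones product cost."""
    totals, running = [base], base
    for c in increments:
        running += c
        totals.append(running)
    return totals

def jump_point(increments: list, threshold: float = 2.0) -> int:
    """Index of the first feature whose incremental cost jumps past
    `threshold` times the previous one; -1 if costs stay smooth."""
    for i in range(1, len(increments)):
        if increments[i] >= threshold * increments[i - 1]:
            return i
    return -1

print(jump_point(feature_costs))  # feature where complexity overloads capacity
```

The real analysis would use each business unit’s actual support costs, but the mechanic is the same: find where the incremental cost curve bends, then decide which features are worth carrying past it.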
• Take advantage of “good enough” technology
As executives look for a competitive edge in an improving economy, they should resist the temptation to write new software, or heavily customize vendors’ applications in every case. More than 80 percent of the time, our clients can meet their needs by taking advantage of off-the-shelf applications configured for their purposes.
The cost of customizing business applications can escalate in ways that aren’t always obvious. Modifications can prevent companies from taking full advantage of the usually superior, and less costly, enhancements vendors create themselves. Companies that rely too much on writing software themselves can also misuse in-house talent. Scarce IT resources should be tapped to give companies an edge in customer service, or boost revenue, profitability, or market share.
• Make outsourcing more strategic
Smart companies see improvements in business conditions as chances to reevaluate their outsourcing approach. Outsourcing companies can often be more productive and cheaper and produce higher-quality work than in-house employees. If they’re not, it may be time to consider new options.
Especially when a recovery is tentative, as this one is, outsourcing is a prudent way to accommodate increased demand that may not last, without committing capital to new assets or fixed costs. Outsourcers can also help companies offer cloud computing services. By locking in contracts on favorable terms early in a recovery, companies gain an edge over competitors.
When IT organizations start using these strategies, CEOs find they can get more out of their information technology investments. Taking these approaches can also make IT departments lean enough to keep delivering benefits when the next downturn inevitably arrives. That’s a deal most technology and line-of-business executives would be happy to make.
Sunday, May 23rd, 2010
IT departments say they want innovation from their outsourcing vendors, and the vendors say they want to provide it. So why is innovation in outsourcing so rare?
There are countless obstacles to achieving anything resembling innovation when outsourcing IT, including ineffective change management, toothless governance, inadequate skills, perverse incentives and powerless managers. But the biggest barrier is inertia. When it comes time to draw up an outsourcing contract, everyone reverts to the safety of the status quo.
“One of the root causes behind lack of innovation in outsourced environments is an overemphasis on stability from buyers and service providers,” says Phil Fersht, founder of outsourcing analyst firm Horses for Sources. “After the contract is signed, buyer executives don’t want noise because they want to avoid second-guessing. The provider’s delivery executive wants all their dashboards to have green indicator lights. Every action taken by both parties promotes stability, but hinders—even suppresses—innovation.”
To achieve innovation in IT outsourcing, customers and suppliers have to shake things up. And that starts with the traditional IT service procurement process of gathering requirements, issuing an RFP, selecting a vendor and signing a contract—that Holy Writ of the outsourcing relationship.
Ironically, contracting for innovation has precious little to do with the contract itself, say outsourcing experts and attorneys. While the contract codifies deal doctrine, in the most successful and innovative IT outsourcing relationships, it quietly gathers dust after the ink is dry. The contract is a consequence of a much more important negotiation—one that establishes a relationship between IT outsourcing customer and provider that will produce innovation while the legal documents sit on a shelf. To achieve that ideal relationship, all parties need to throw out the old notions that govern the traditional IT services procurement process and instead take the following approach.
1. Delay the RFP
In today’s world of urgent cost cutting and speed sourcing, there’s a rush to get the RFP out the door. But IT outsourcing customers need to decide upon innovation goals before even thinking about soliciting proposals or structuring the vendor selection process.
“If the enterprise wants any innovation, they should understand that the cookie-cutter RFP with the price-driven negotiation is not an effective vehicle,” says Bill Bierce, co-founder of technology law firm Bierce & Kenerson.
2. Define Innovation
It’s easier to agree on what innovation isn’t than what it is. “Innovation is not the service provider meeting or exceeding service level commitments,” says Fersht. “Those service levels are a component of the contractual agreement between the provider and the buyer, and thus should be met, plain and simple.”
True innovation might mean continuous process improvement, emerging technology implementation, new best practices, IT transformation or competitive advantage. A clear definition of innovation is required so that the contract will reflect the appropriate financial and other terms associated with it, says Daniel Masur, a partner in the Washington, D.C. office of law firm Mayer Brown.
The sad fact is, many IT departments have grown so consumed with keeping the lights on over the past few years that they “have lost touch with the innovative spirit and the knowledge of what innovation means to their firm and industry,” says Fersht.
Consequently, they rely on the outsourcer to define innovation for them, which puts the vendor in a difficult position, Fersht adds. A service provider can’t be expected to deliver significant innovation without knowing what types of innovation would help its client attain and maintain its strategic objectives, he says.
Fersht recommends drawing up a strategic innovation plan and a process for updating it. It should outline the outsourced environment and those activities retained internally, and how to innovate within the new framework.
3. Use Outsourcers as Consultants
Attorney Bierce recommends to his clients that they approach IT service innovation as a consulting project and solicit recommendations for change from potential providers. “This poses some challenges for outsourcers who claim to have trade secret processes for industry verticals, and that they would be exposed by putting out their trade secrets into an environment where the enterprise customer would then just bid out the work to a third party on a commodity pricing basis,” says Bierce. “But this risk is small compared to the business opportunities.”
Indeed, IT service providers from IBM (IBM) and Infosys (INFY) to Accenture and CapGemini emphasize their consulting business as a complement to traditional IT outsourcing to take deals to a higher level.
“I think the best [IT service providers] start by seeking to understand the complex and sometimes unique needs of IT and business professionals,” says Forrester Research Senior Analyst Chris Andrews. “More and more, I see companies pointing to strategy sessions and methodologies to bring IT and business together to talk about the tactical and strategic role of technology.”
Michael S. Mensik, partner in the Chicago office of Baker & McKenzie, believes IT departments could better ensure true innovation by spending more time with vendors up front, before the contract is signed, examining and modeling precisely how innovation will be achieved. He says both parties should discuss the processes that will need to be put in place to further innovation, the investments that each party will need to make and the change management measures that will be required.
While suppliers may be willing to put in a little extra work up front to get your business, much of this consultation will come at a price. “Whatever the competitive pressures, there is just so much that the vendors will do as part of an RFP process,” Mensik says. “But I think in many cases the ROI on such an investment will be considerable. Coming up with a more detailed blueprint before committing to a vendor is, I think, one way of better ensuring success.”
4. Lock Everyone in a Room
When it comes to the quest for innovation in IT outsourcing, the phrase “too many cooks spoil the broth” doesn’t apply. Invite all key business and IT stakeholders and vendor executives to a conference room, advises Forrester Vice President and Principal Analyst John McCarthy. Then lock the door and hash out the ground rules that will govern the outsourcing relationship.
This approach ensures commitment from key internal stakeholders, which is important for outsourcing success, particularly transformational deals. “Any organization needing change has constituencies that will resist change,” says Bierce. “This is not the outsourcer’s problem but becomes its problem by default if the groundwork is not in place.”
Arguing over—and ultimately agreeing on—details of the deal establishes a framework for the conflicts destined to come up over the course of the relationship, says McCarthy.
“I asked a CIO from a Fortune 500 company who had just led his company through a huge transformation project with a leading services firm what he would have done differently, and he said, ‘I would have involved business decision makers in the process much, much earlier. I needed their insight and support to make this project work,’” says Forrester’s Andrews.
5. Loosen the Purse Strings
The average outsourcing selection and negotiation process focuses on one point above all else—price. But if you want innovation, you’re going to have to pay for it.
“Innovation costs the local account team money in terms of leveraging experts, process advancements or new technologies,” says Fersht. “But buyers are often reluctant to spend adequate funds on these efforts.”
Everyone wants value from outsourcers, particularly when times are tough, but stingy clients will get what they pay for, particularly if they haven’t been able to clearly define innovation pre-contract. “The interests of the parties must be aligned,” says Masur. “It is not realistic to expect a service provider to deliver the lowest possible price and still fund innovation initiatives and pass the resulting savings to the customer.”
Even if you think you’re paying a premium for innovation, it pays to verify the employee incentives put in place by the vendor. “Often the account team is very motivated to achieve a profit target and innovation is fluff that cuts into their discretionary funding,” says Fersht. Talk to the provider about unique compensation plans that encourage innovation on your account.
6. Share the Wealth
Of course, the provider as a whole needs some inspiration to innovate, too, particularly of the profit-boosting variety.
The concept of gain-sharing—rewarding the vendor when the client benefits from lower costs, increased revenue or improved efficiency—has always been a controversial one among outsourcing customers. But if there were ever a time to consider it, it’s when you’re seeking something above and beyond from outsourcing.
“I know how hard it is to consider gain-sharing. The discussion becomes a mini-joint venture, with issues of risk, reward, decisional authority, institutional impediments and shifting roles,” says Bierce. “But this kind of discussion can be valuable.”
You might set up a jointly funded pool to pay for agreed-upon innovation initiatives, or share the savings generated by particular innovation projects, says Masur. And the client doesn’t necessarily have to take a financial hit. The IT outsourcing customer might allow the provider to use the resulting products or systems to deliver services to other customers, or waive its right to benchmark if a vendor consistently achieves high innovation scores in 360-degree performance reviews.
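The mechanics of a gain-sharing arrangement can be sketched in a few lines. The split ratio, function name, and structure below are purely illustrative assumptions, not terms from any contract discussed above; the point is simply that once a baseline cost and an agreed vendor share are defined, the reward calculation itself is straightforward.

```python
# Hypothetical gain-sharing split: when a vendor-led innovation cuts the
# client's cost below an agreed baseline, the realized savings are divided
# between client and vendor by a pre-negotiated ratio.
def gain_share(baseline_cost: float, actual_cost: float,
               vendor_share: float = 0.3) -> dict:
    """Split realized savings between client and vendor.

    vendor_share is the agreed fraction of savings paid to the vendor.
    Both the 30% default and the overall structure are illustrative.
    """
    # Only positive savings count; cost overruns yield no bonus.
    savings = max(baseline_cost - actual_cost, 0.0)
    vendor_bonus = savings * vendor_share
    return {
        "savings": savings,
        "vendor_bonus": vendor_bonus,
        "client_net": savings - vendor_bonus,
    }

# Example: a $1,000,000 baseline reduced to $850,000 with a 30% vendor share
# yields $150,000 in savings, split $45,000 to the vendor, $105,000 to the client.
result = gain_share(1_000_000, 850_000)
```

Real agreements layer on complications this sketch ignores, such as who measures the baseline, caps on the vendor bonus, and how jointly funded innovation pools are replenished.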
To discuss an innovative IT solution for your high-powered organization, call Percento Technologies: 800.614.7886
Saturday, May 22nd, 2010
SAN FRANCISCO–Could the long-awaited marriage of the television and the Web be blessed by a search company?
Google is at least going to make an attempt, unveiling the signature announcement of Google I/O 2010, Google TV, before a crowd of developers at the Moscone Center Thursday. While Google will need developer support to make Google TV happen, the message wasn’t entirely aimed at them.
Instead, in convening a panel of some of the most important CEOs in the world of consumer electronics–Sony, Best Buy, and Intel, among others–Google declared its intention to shake up the world of consumer devices the same way it has disrupted countless other industries in its 12 years as an organization. Google is attempting to do what the PC and consumer electronics industries have tried–and failed–to do for years: bring the nearly unlimited content of the Web to the large-screen TV while preserving the tried-and-true television experience that has enraptured three generations of Americans.
If this effort succeeds, there will be a new power broker in consumer electronics. And Google will have found a way to move past its identity as The Search Company in order to focus on a future based around Web-connected consumer-oriented software.
It’s far from a slam dunk: powerful entrenched industries tend not to like it when Google comes knocking on their door. And tech conference demos alone–especially buggy ones–do not sell a product. But after the failed attempts of the Wintel duopoly (remember that?) to accomplish this goal in the last decade, Google is pushing ahead with its own take on the problem at a time when people might finally be ready to listen.
So what is Google TV? Essentially, it’s an Android-based operating system for televisions and set-top boxes that fulfills one of the key goals that eluded the PC industry years ago: seamless integration of Web content and cable or satellite content.
Intel and Microsoft wanted to put PCs in living rooms, attempting to dress them up to look like cable boxes or DVRs. However, people didn’t want to buy another full-fledged PC simply to sit in their entertainment centers and drown out the movie with the sound of the cooling fan. And the Windows brand did not resonate with the consumer electronics set, who didn’t want long boot times or PC weirdness when trying to fire up their favorite show.
Apple waded tentatively into these waters with Apple TV, providing a smaller and less obtrusive box for the living room but walling off the content experience to the iTunes Store and putting few resources behind the project. More recently, a host of other devices like Boxee, Roku, and Slingplayer have tried to deliver Internet content to the television, but they force the user to choose between “Internet mode” and “television mode,” and it’s remarkable how reluctant people are to hit a button to switch between input modes.
So could Google TV break this logjam? The promise is certainly there: offering bored TV viewers a better way to search for things that interest them seems like a winner. And layering the Internet over existing television is an idea that has shown some promise, in things like Yahoo’s work on TV widgets.
There are more than a few challenges. For one, nobody has any idea what these TVs and set-top boxes will cost relative to existing devices. People might be convinced to pay some sort of premium for this experience, but how much? These are uncharted waters.
And how will Google’s search technologies be implemented in this product? Mark Cuban, founder of Broadcast.com and HDNet, and avid NBA playoff spectator (as opposed to participant), nailed it when he said Thursday “the success of Google TV will come down to one thing…PageRank. Can you imagine the white hat and black hat SEO battles that will take place as video content providers try to get to the top of the TV Search Listings on Google TV?…How Google does its PageRank for this product will have a bigger impact on the success of the product in the TV market than anything else it does.”
But aside from the questions about Google TV itself, the announcement once again reveals Google’s limitless ambition. This is a company that honestly thinks it can provide better technology products and services than anyone else in the world.
People laughed when Google got into mobile operating systems, wondering how a search company could break into a market dominated by old hands like Nokia and RIM as well as new upstarts like Apple (which at least had the benefit of decades of world-class software development). That seems to have worked out well for Google: it’s the second largest smartphone operating system supplier in the U.S. at the moment, behind RIM and ahead of Apple.
There are few companies that could have assembled a CEO roster like the one Google put together Thursday. Coordinating the schedules of six major consumer electronics and computer industry CEOs must have taken a huge effort behind the scenes, and they weren’t even all in Las Vegas in January for CES. It was quite a list: Intel CEO Paul Otellini, Sony CEO Sir Howard Stringer, Logitech CEO Jerry Quindlen, Dish Network CEO Charlie Ergen, Best Buy CEO Brian Dunn, and Adobe CEO Shantanu Narayen.
As we alluded to earlier in the week, Google is reaching a point in its evolution where it is bringing the tech industry into its own orbit. Consider this: Intel and Sony played second fiddle to Google Thursday in an announcement that highlighted their own failures to produce such a product.
And however Google’s ruling triumvirate might feel about Apple CEO Steve Jobs and all he has accomplished over the years, Google could not have drawn clearer battle lines on Thursday: it wants to be as prominent a consumer electronics software company as Apple, and it is going about that strategy by marshaling industry support, rather than going it alone.