Showing posts with label sustainability.

    Friday, September 28, 2012

    A Lean Greentech Approach

    I am a greentech enthusiast and I have been closely following the greentech VC investment landscape. VCs like Kleiner Perkins, which have a large greentech portfolio including companies such as Bloom Energy, are scaling down their greentech investments. Their current investments are unlikely to generate returns anywhere close to what a VC would expect. The fundamental challenge with such greentech (excluding software) investments is that they are open-ended and capital-intensive; you just don't know how much time it will take to build the technology or product, how much it will cost, and how much you will be able to sell it for. Market fluctuations make things even worse. This is true not only for start-ups but also for large companies; Applied Materials' grand plan to revolutionize the thin-film solar business ended up a bust.

    There's a different way to approach this monumental challenge.

    Just look at how open source has evolved. It started out as non-commercial academic projects, where a few individuals challenged the way existing systems behaved and created new ones. These open source projects found corporate sponsors who embraced them and helped them find a permanent home. This also resulted in a vibrant ecosystem that extended those projects. A few entrepreneurs looked at these open source projects and built companies to commercialize them with the help of VC funding. Time after time, this business model has worked. Technologists are great at building technology, companies are great at throwing money at people, entrepreneurs are great at extending and combining existing technology to create new products, and VCs are great at funding those companies to help entrepreneurs build businesses. What VCs are not good at is doling out very large sums of money to bet on technology that doesn't yet exist.

    If we want to make this work, we need a three-way relationship. People in academia should work on capital-intensive greentech technology projects that are funded by corporations through traditional grants. These projects should become available in the public domain under an open-source-like license, or even a commercial one. Entrepreneurs can then license this technology, open source or not, and raise venture money to build a profitable business. The companies that are contributing their greentech initiatives to the public domain should continue to do so, as Google does by sharing its green data center design.

    The important aspect is to differentiate technology from a product. VCs are not that good at investing in (non-software) technology but are certainly good at investing in products. For many greentech companies, technology is a key piece, such as a battery, a specific kind of solar film, or a fuel cell. Commercializing that technology is a completely different story. It requires setting up key partnerships, such as the Israeli government committing to a nationwide all-electric car infrastructure with Better Place.

    Many large companies have set up incubators or "labs" to find something fundamentally disruptive that could help their business. However, there have been very few success stories from these incubators and labs, because the start-up world is far more efficient at doing what big companies want to do. These labs are also torn between technology and products. My suggestion to them would be to go back to what they were good at: hiring great scientists from academia and working with academia on next-generation technology, then creating a business model by either using that technology in their own products or licensing it to others who want to build a business. This shifts the investment from a few VCs to a relatively large number of corporations.

    What we really need is a lean greentech approach.

    Photo Courtesy: Kah Wai Lin

    Thursday, July 9, 2009

    Chief Sustainability Officer - the next gig for a CIO

    CIO no longer means Career Is Over. CIOs should not underestimate their skills and organizational clout to lead the company in its sustainability efforts by becoming a Chief Sustainability Officer (CSO).

    Leverage your relationship with the business: As a CIO you work closely with the business and have a holistic understanding of the challenges the business faces and the growth opportunities it aspires to go after. You can leverage that relationship to own and execute the sustainability strategy and to effectively measure and monitor progress using your expertise in, and investment into, IT systems. You can walk your business counterparts through a scenario-based architecture to help them quantify the business impact of sustainability initiatives and estimate the required transformation effort.

    Start with Green IT and lead the industry: Start with the area you are most familiar with. Reduce the carbon footprint of your IT systems by improving the PUE of your data centers and better managing the energy consumption of desktops. If you decide to disinvest in data centers and move tools and applications to the cloud, it will not only reduce energy costs but may also result in consuming cleaner energy. Share your best practices with industry peers and lead your industry in its sustainability efforts.

    Make sustainability a business differentiator: For many organizations sustainability is not just a line item in the corporate responsibility report; it is the future growth strategy and a sustainable competitive advantage over the competition, e.g. a sustainable supply chain, higher operating margins, end-to-end environmental compliance, etc. As a CIO you have the right weapons and skills in your arsenal to drive the organization's sustainability initiatives. You could help your company innovate and grow by leaps and bounds by focusing on sustainability. This could be a blue ocean strategy for many organizations struggling in the red ocean to beat the competition. You also have an opportunity to empower your customers in their mission to be sustainable by providing them the data they need, e.g. a bill of materials with carbon footprint and recycle index, real-time energy measurement, etc.

    Redefine the program management office: Sustainability projects are similar to IT projects in many ways: getting a large set of stakeholders to commit without having much influence over them, working with internal employees, customers, and partners, etc. Traditionally you have been running the program management office for technology and information management projects. Apply the same model and leverage the skills of your program managers to run sustainability projects internally as well as externally. Sustainability is fundamentally about changing people's behavior. Promote alternate-commute tools such as RideSpring, carbon social networks such as Carbonrally, and employee-led green networks such as the eBay Green Team. Run targeted campaigns to reduce energy and paper consumption, increase awareness, and solicit green ideas. The right kind of tools, with an executive push and social support, could create a great sustainability movement inside an organization.

    Chief Sustainability Officer is an emerging title. Your ability to work across the organization, leverage your relationship with the business to sell it on sustainability goals, and manage tools that reach all parts of your organization makes you well suited for this role. A CSO does not necessarily have to be a domain expert in sustainability. In fact, I would expect a CSO to be a people person who can make things happen with the help of sustainability experts and visionaries.

    Now you know what your next gig looks like.

    Thursday, June 18, 2009

    Cloud Computing At The Bottom Of The Pyramid

    I see cloud computing playing a big role in enabling an IT revolution in developing nations, helping companies market products and services to the 4 billion consumers at the bottom of the pyramid (BOP). C.K. Prahalad has extensively covered many aspects of the BOP strategy in his book The Fortune at the Bottom of the Pyramid, a must-read for strategists and marketers working on a BOP strategy.

    This is how I think cloud computing is extremely relevant to companies trying to reach consumers at the BOP:

    A logical extension to the mobile revolution: The mobile phone revolution at the BOP has changed the way people communicate in their daily lives and conduct business. Many people never had a landline, and in some cases no electricity; some charged their mobile phones using a bike-powered generator. As cellular data networks become more mature and reliable, the same consumers will have access to the Internet on their mobile phones without having a computer or broadband at home.

    Marketers tend to be dismissive of the spending power of people at the BOP to buy and use a device that could consume applications from the cloud. The BOP requires innovative distribution channels. The telcos that have invested in the current BOP distribution channels will have a significant advantage over their competitors. The telcos that empowered people to leapfrog the landline and move to mobile phones could further invest in infrastructure and become the cloud providers fueling the IT revolution. They already have relationships with consumers at the BOP that they can effectively utilize to peddle more products and services.

    Elastic capacity at utility pricing: Computing demand growth in developing countries is not going to be linear, and it is certainly not going to be uniform across countries. Cloud computing is the right kind of architecture to let companies add computing infrastructure as demand surges amongst BOP consumers in different geographies. Political issues aside, the data centers, if set up well, could potentially work across countries to serve concentrated BOP populations. Cloud computing would also allow application providers to eliminate upfront infrastructure investment and truly leverage the utility model. BOP consumers are extremely value-conscious. It is a win-win situation if this value can be delivered to match true ongoing usage at zero upfront cost.
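To make the elasticity point concrete, here is a minimal sketch of capacity following demand under a utility model. All numbers (requests per second, per-instance capacity, headroom) are illustrative assumptions, not any provider's real figures:

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance=100.0, headroom=0.2):
    """How many instances to run for the current demand, with spare headroom.

    Under utility pricing you pay only for what you run, so capacity can
    follow demand instead of being provisioned up front for the peak.
    """
    return max(1, math.ceil(requests_per_sec * (1 + headroom) / capacity_per_instance))

# Demand surging through the day in one geography:
for rps in (40, 250, 900):
    print(rps, "req/s ->", instances_needed(rps), "instances")
# → 1, 3, and 11 instances respectively
```

The point of the sketch is the shape of the cost curve: with zero instances pre-purchased, the bill tracks the three demand levels instead of the 900 req/s peak.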

    Cheap computing devices: OLPC machines and other small devices such as netbooks are weak in computing power and low on memory, but they are a good-enough solution for running a few tools locally and an application inside a browser. These devices would discourage people from using thick-client applications that require heavy computation on the client side. Netbooks, along with the coming tablets and other smaller devices, are likely to proliferate since they are affordable, reliable, and provide the value that BOP consumers expect. Serving tools and applications from the cloud might just become an expectation, especially when these devices come with prepaid data plans.

    A highly skilled top of the pyramid serving the BOP: Countries such as India and China have highly skilled IT people at the top and middle of the pyramid. These people have the skills to write the new kinds of software that will fuel cloud computing growth in these emerging economies. The United States has been going through a reverse-immigration trend amongst highly skilled IT workers who have chosen to return to their home countries to pursue exciting opportunities. These skilled people are likely to bring their experience of the Western world to build a new generation of tools and applications, and innovative ways to serve people at the BOP.

    Sustainable social economies: It might seem that countries with a large BOP population are simply not ready for modern and reliable IT infrastructure due to bureaucratic government policies and a lack of modern infrastructure. However, if you take a closer look, you will find that these countries receive large FDI inflows [pdf] that empower companies to invest in the modern infrastructure that creates a sustainable social economy.

    Most of the petrochemical refineries and cement manufacturing plants that I have visited in India do not rely on the grid (utility) for electricity. They have set up their own captive power plants (CPPs) to run their businesses. Running a mission-critical data center would similarly require in-house power generation. As I have argued before, local power generation for a data center results in cleaner energy and reduced distribution loss. There are also discussions about generating DC power locally to feed data centers, to minimize AC-to-DC conversion loss. A relatively inexpensive and readily available workforce that has been building and maintaining power plants will make it easier to build and maintain these data centers as well. Local governments would encourage investment that creates employment opportunities. Not only does this allow these countries to serve the BOP and build a sustainable social economy, it contributes to the global sustainability movement as well.

    Tuesday, March 31, 2009

    Design Thinking Sustainability

    Designers have traditionally designed tools, processes, and methods to support people's behavior, not to change it. In contrast, designing for sustainability fundamentally requires changing people's behavior. Behavior change to achieve sustainability goals means offering different alternatives, encouraging reduced consumption, making people conscious of their behavior, and leveraging peer pressure and competition. Design that maintains the status quo will not help achieve sustainability goals. The design will have to be provocative and challenge users' assumptions in many ways.

    Design thinking is about how you think and not what you know; it is about the journey and not the destination. For a problem of massive scale such as sustainability, where we still know little and the desired outcome may take years, the following are some elements of design thinking that could help make the world a better place to live for the generations to come.

    Ambidextrous thinking: Sustainability being fundamentally a sociological, psychological, and economic problem, designers not only need to synthesize what they observe but also to design their solutions based on well-analyzed hard facts. It requires designers to use both sides of their brain, left and right, to feel and to think. Human beings respond to positive and negative incentives, e.g. charging people for grocery bags or allowing hybrid cars in carpool lanes. An ambidextrous approach allows designing creative incentives, such as the real-time gas-consumption display in the Prius that changes the driver's behavior. It also prevents blindly rolling out initiatives that feel right but are outright wrong, such as paper bags instead of plastic: even though they are easy to down-cycle, paper bags consume more energy to manufacture than plastic bags. All types of reasoning (inductive, deductive, and abductive) are quintessential to dream, design, and validate the solutions.

    Analogous research: This approach allows designers to explore analogous problems with similar characteristics in other domains to gain insights and be inspired. Weight-loss programs and alcohol support groups use social levers such as community support, peer pressure, and competition to help change people's behavior. Green social networks such as Carbonrally and Climate Culture are designed to leverage social competition toward green living. Similarly, the community-support aspect behind Curves, the fast-growing fitness chain for women, can be studied to understand the role of community in changing people's behavior. Wiser Earth is an effort in this direction that uses community to connect people with non-profits and businesses to work together toward a sustainable world.

    Researching an analogous domain is even more important when the primary domain, such as sustainability, does not allow solutions to be tested effectively due to its dry, intangible, and emerging nature. When given the task of designing an emergency room, a few people from IDEO went to a NASCAR race to observe the pit crew, to better understand what kinds of things can go wrong in an emergency and how people respond to those events.

    Empathy: Put yourself in the shoes of the people you are trying to change. As Thomas Friedman says, people are having a green party, not a green revolution. Go to these green parties and follow people around to better understand what it will take to turn these parties into a true green revolution. Is it a lack of awareness, motivation, or incentive? Gain empathy for the people and understand their perspective in their context: what will it take, socially and economically, for them to change their behavior?

    Context is critical for design thinking. By observing and talking to people in their natural environment, designers gain empathy for them and discover behavior patterns they would not have found had they sat in their offices thinking about how they should change people's behavior.

    A holistic, multidisciplinary approach: Sustainability efforts span different cultures, countries, backgrounds, and belief systems. To successfully solve this problem from the tools, behavior, and policy perspectives, people from different disciplines, such as engineers, scientists, interaction designers, social scientists, policy makers, and business executives, need to come together and work on it collaboratively. A naive, curious, and inclusive mindset allows designers to study the problem holistically from the perspective of all stakeholders: manufacturers, consumers, policy makers, etc. Tools, technology, incentives, and policies, if designed in isolation, leave gaps and often result in confirmation bias.

    Be tangible and iterate often: This is a daunting problem, and trying to boil the ocean would lead to an analysis-paralysis nightmare. This is not a mature domain, and there are no certainties about what will work and what won't. The best approach is to rapidly prototype a solution, get early feedback from consumers, and iterate often. There has been an ongoing debate over a carbon tax versus carbon cap-and-trade. Instead of getting stuck in the controversy, opinions, and abstract ideas, there is an opportunity to build something tangible and let people validate their own assumptions. A tangible object, as opposed to an abstract concept, enables better conversations and feedback channels, since the discussion becomes about the solution and not about the problem.

    Focus on the journey and emergent experimentation: Design thinking is about thinking in a different way, not about having specific skills. It focuses on the journey, the method, and not on the outcome. People demand instant gratification, but sustainability is not like a Biggest Loser competition where a weekly weigh-in tells you where you stand. It will take years before we can actually quantify the impact of the sustainability efforts we are asking people to put in today. It is one of those initiatives that may not show any short-term benefits at all. For such initiatives a top-down compliance strategy won't work. A good design with emergent experimentation will focus on the journey, not the destination, with iterative results along the way to convince people how a change in their behavior slowly changes the world around them. People will believe in the journey and the emergent experimentation.

    Update: John R. Ehrenfeld, currently Executive Director of the International Society for Industrial Ecology and previously Director of the MIT Program on Technology, Business, and Environment (an interdisciplinary educational, research, and policy program), has picked up this story and posted it on his blog, Sustainability by Design.

    Friday, December 19, 2008

    De-coupled Cloud Runtime And Demand-based Pricing Suggest Second Wave Of Cloud Computing

    A couple of days ago Zoho announced that applications created using Zoho Creator can now be deployed on the Google cloud. On the same day Google announced its tentative pricing scheme for buying resources on its cloud beyond the free daily quota. We seem to have entered the second wave of cloud computing.

    Many on-demand application vendors who rely on non-cloud-based infrastructure have struggled to be profitable because the infrastructure cost is far too high. These vendors still have value-based pricing for their SaaS portfolios and cannot pass the high infrastructure cost on to their customers. The first wave of cloud computing provided a nice utility model to customers who wanted to SaaS up their applications without investing in infrastructure, charging their customers a fixed subscription. As I observe the second wave of cloud computing, a couple of patterns have emerged.

    Moving to the cloud, one piece at a time: Vendors have started moving the runtime to a third-party cloud while keeping the design time on their own cloud. Zoho Creator is a good example: you can use it to create applications on Zoho's infrastructure and then optionally use Google's cloud to run and scale them. Some vendors such as Coghead are already ahead in this game, keeping both design time and runtime on Amazon's cloud. Many design tools that have traditionally been on-premise might stay that way and could help end users run part of their code on the cloud or deploy the entire application there. Mathematica announced an integration with Amazon's cloud where you can design a problem on-premise and send it to the cloud to compute. Nick Carr calls it the cloud as a feature.

    Innovate with demand-based pricing: As cloud vendors become more creative about how their infrastructure is utilized and introduce demand-based pricing, customers can innovate around their consumption. Demand-based pricing for the cloud could allow customers to schedule the non-real-time tasks of their applications for when computing is cheap. This approach will also make data centers greener, since energy demand is now directly tied to computing demand, which is being managed by creative pricing. This is not new to green advocates, who have long been pushing for a policy change to promote a variable-pricing model for utilities that would base the price of electricity on demand rather than a flat rate. Consumers can benefit by having their appliances and smart meters negotiate with the smart grid for the best pricing. Utilities can benefit by better predicting demand and making generation more efficient and green. I see synergies between the cloud and green IT.
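As a sketch of what scheduling around demand-based pricing could look like (the price table and job names are hypothetical; real prices would come from the provider's off-peak or spot pricing feed):

```python
# Hypothetical hourly compute prices ($ per instance-hour).
hourly_price = {0: 0.04, 6: 0.06, 12: 0.10, 18: 0.08}

def cheapest_hour(prices):
    """Return the hour with the lowest price."""
    return min(prices, key=prices.get)

def schedule_batch_jobs(jobs, prices):
    """Assign every deferrable, non-real-time job to the cheapest hour."""
    hour = cheapest_hour(prices)
    return {job: hour for job in jobs}

print(schedule_batch_jobs(["nightly-report", "log-rollup"], hourly_price))
# → {'nightly-report': 0, 'log-rollup': 0}
```

Real-time tasks still run on demand; only the deferrable work chases the cheap hours, which is exactly what flattens the energy curve.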

    Monday, December 1, 2008

    Does Cloud Computing Help Create Network Effect To Support Crowdsourcing And Collaborative Filtering?

    Nick has a long post about Tim O'Reilly not getting the cloud. He questions Tim's assumptions on Web 2.0, network effects, power laws, and cloud computing. Both of them have good points.

    O'Reilly comments on the cloud in the context of network effects:

    "Cloud computing, at least in the sense that Hugh seems to be using the term, as a synonym for the infrastructure level of the cloud as best exemplified by Amazon S3 and EC2, doesn't have this kind of dynamic."

    Nick argues:

    "The network effect is indeed an important force shaping business online, and O'Reilly is right to remind us of that fact. But he's wrong to suggest that the network effect is the only or the most powerful means of achieving superior market share or profitability online or that it will be the defining formative factor for cloud computing."

    Both of them also argue about applying power laws to cloud computing. I am with Nick on the power laws but strongly disagree with his view of cloud computing and network effects. The cloud at the infrastructure level will still follow power laws due to the inherently capital-intensive requirements of a data center, and the tools on the cloud will help create network effects. Let's make sure we all understand what power laws are:

    "In systems where many people are free to choose between many options, a small subset of the whole will get a disproportionate amount of traffic (or attention, or income), even if no members of the system actively work towards such an outcome. This has nothing to do with moral weakness, selling out, or any other psychological explanation. The very act of choosing, spread widely enough and freely enough, creates a power law distribution."

    Any network effect starts with a small set of something (users, content, etc.) and eventually grows bigger and bigger. The cloud makes a great platform for systems that demand this kind of growth. The adoption barrier is close to zero for companies whose business model actually depends on creating these effects. They can provision their users, applications, and content on the cloud, be up and running in minutes, and grow as the user base and content grow. This actually shifts power to the smaller players and helps them compete with the big cloud players while still allowing them to create network effects.
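The quoted point, that free choice alone produces a power law distribution, can be illustrated with a small simulation. This is a toy preferential-attachment model with made-up parameters, not a model of any real cloud market:

```python
import random

def simulate_choices(n_options=50, n_choices=10_000, seed=42):
    """Each new chooser picks an option with probability proportional to its
    current popularity (preferential attachment). No one aims for a skewed
    outcome, yet one emerges from the choosing itself."""
    random.seed(seed)
    counts = [1] * n_options          # every option starts equally popular
    for _ in range(n_choices):
        pick = random.choices(range(n_options), weights=counts)[0]
        counts[pick] += 1
    return sorted(counts, reverse=True)

counts = simulate_choices()
top_share = sum(counts[:5]) / sum(counts)
print(f"top 5 of 50 options took {top_share:.0%} of all choices")
```

Run it and the top handful of options capture a disproportionate share, which is the "small subset gets a disproportionate amount of traffic" dynamic the quote describes.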

    The big cloud players, currently on the supply side of this utility model, have few options on the table. They can keep themselves to the infrastructure business, in which case I would wear my skeptic hat and agree with a lot of people on the poor viability of this capital-intensive business model with very high operational costs. This option alone does not make sense; the big companies have to have a strategic intent behind such a large investment.

    The strategic intent could be to SaaS up their tools and applications on the cloud. Investment in, and control over, the infrastructure would provide a head start. They can also bring in a partner ecosystem and crowdsource a large user community to create a network effect of social innovation based on collective intelligence, which in turn makes the tools better. One of the challenges with recommendation systems that use collaborative filtering is mining massive information, including users' data and behavior, and computing correlations by linking it with massive information from other sources. The cloud makes a good platform for such requirements due to its inherent ability to store vast amounts of information and perform massively parallel processing across heterogeneous sources. There are obvious privacy and security issues with this kind of approach, but they are not impossible to resolve.
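To show the correlation computation at the heart of collaborative filtering, here is a toy cosine-similarity sketch over made-up ratings. Production systems run this same idea at massive scale, which is why the cloud is a good fit; the users and items below are purely illustrative:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine similarity between two users' sparse rating vectors
    (dicts mapping item -> rating)."""
    common = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in common)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical users and ratings:
alice = {"solar-kit": 5, "led-bulb": 4, "compost-bin": 1}
bob   = {"solar-kit": 4, "led-bulb": 5}
carol = {"compost-bin": 5, "rain-barrel": 4}

print(round(cosine_similarity(alice, bob), 2))    # → 0.96 (similar tastes)
print(round(cosine_similarity(alice, carol), 2))  # → 0.12 (dissimilar)
```

A recommender would then suggest to Alice the items her most similar neighbors rated highly; doing that across millions of users is the "massively parallel processing" workload described above.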

    Google, Amazon, and Microsoft are supply-side cloud infrastructure players that are already moving into the demand side of the tools business, though I would not call them equal players exploring all the opportunities.

    And last but not least, there is a sustainability angle for the cloud providers. They can help consolidate thousands of data centers into a few hundred based on geographical coverage and the availability of water, energy, dark fiber, etc. This is similar to consolidating hundreds of dirty coal plants into a few non-coal green power plants that produce clean energy with an efficient transmission and distribution system.

    Thursday, November 13, 2008

    Continuous Passive Branding During Economic Downturn To Change Customers' Opinions

    The current economic downturn has forced many CIOs to significantly reduce external IT spending. Many projects are being postponed or canceled. This situation poses serious challenges to the sales and marketing people at companies selling enterprise software. Many argue that there is not much these people can do. I disagree.

    Marketing campaigns tend to rely a lot on selling a product through active, aggressive marketing, which may not be effective under these circumstances since many purchase decisions are on hold. However, these circumstances and a poor economic climate are ideal for building a brand and peddling concepts through a continuous passive branding exercise. The branding exercise, if designed well, could change buyers' experience around a concept or a product and evoke emotions that could be helpful when a product is actively being sold. Guy Kawasaki points us to an experiment that studied the art of persuasion in changing people's attitudes. You should always be selling, since the best way to change someone's mind is to sell to them when they are not invested in an active purchase decision, emotionally or otherwise.

    GE’s green initiative, branded as ecomagination, is an example of such a passive branding exercise. Last year the Climate Brand Index rated GE No. 1 among green brands. GE published a page-long ad in a leading national magazine introducing its new green aviation engine. Jeff could instead have picked up the phone, called Boeing and Airbus, and said "hey, we have a new engine." Instead, GE peddled its green brand to eventually support its other products, such as green light bulbs. Climate change is a topic many people are not emotionally attached to and take a neutral position on, but such continuous passive marketing campaigns could potentially change people's opinions.

    Apple’s cognitive dissonance play is also a well-known branding strategy to passively convince consumers that a Mac, in general, is better than a Windows PC. Many people simply didn’t have a stance on laptops, but now, given a choice, many do believe they like a Mac.

    The art of persuasion goes well beyond marketing campaigns. Keeping customers engaged on these topics and driving thought leadership is even more important during this economic downturn. The sales conversation is not limited to selling a product; it also includes selling a concept or a need. Marketing matters even more when customers are not actively buying anything. Leaders should not fixate on campaign-to-lead metrics. Staying with customers through this downturn and helping them extract maximum value from their current investment will go a long way, since customers don't feel their opinions are being changed by a seemingly neutral vendor. When the economic climate improves and a customer initiates a purchase, that sales cycle is not going to be as long and dry.

    Leaders should carefully evaluate their investment strategy during this economic downturn. The economy will bounce back; the question is whether they will be ready to leapfrog the competition and be a market leader when that happens. Cisco recently announced its 2009 Q1 results. John Chambers made Cisco's strategy in the downturn very clear: invest aggressively in two geographies, the U.S. and selective emerging countries, since the emerging countries will provide steady growth as they develop, and be prepared to sell in the Western countries, since they are likely to be the first to come out of this downturn.

    “In our opinion, the U.S. will be the first major country to recover. The strategy on emerging countries is simple. Over time we expect the majority of the world’s GDP growth will come from the emerging countries. In expanding these relationships during tough times, our goal is to be uniquely positioned as the market turn-around occurs. This is identical to what we did during Asia's 1997 financial crisis.”

    Thursday, October 16, 2008

    Greening The Data Centers

    Recently Google published the Power Usage Effectiveness (PUE) numbers for its data centers. PUE is defined as the ratio of the total power consumed by a data center to the power consumed by its IT equipment. Google's data centers' PUE values range from 1.1 to 1.3, which is quite impressive, though it is unclear why the data centers have slightly different PUEs. Are they designed differently, or are some simply not yet tuned for energy efficiency? In any case, I am glad to see that Google is committed to the Green Grid initiative and is making both its measurement data and methodology publicly available. This should encourage other organizations to improve the energy performance of their data centers.
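    The PUE definition above is a simple ratio, and a quick sketch makes it concrete (the power readings here are hypothetical, chosen only to land inside Google's reported 1.1–1.3 range):

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.

    A PUE of 1.0 would mean every watt entering the facility reaches
    the IT equipment; the excess is cooling, power distribution, lighting, etc.
    """
    if it_equipment_kw <= 0:
        raise ValueError("IT equipment power must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical readings: a facility drawing 1200 kW overall,
# of which 1000 kW goes to servers, storage, and networking.
print(round(pue(1200, 1000), 2))  # 1.2
```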

    The energy efficiency of a data center can be classified into three main categories:

    1. Efficiency of the facility: The PUE is designed to measure this kind of efficiency, which depends on how the facility hosting a data center is designed: its physical location, layout, sizing, cooling systems, etc. Some organizations have gotten quite creative in improving this kind of efficiency, for example by building an underground data center to maintain a consistent temperature, siting data centers near a power generation facility, or even setting up their own captive power plant to reduce distribution losses from the grid and meet peak load demand.

    2. Efficiency of the servers: This efficiency depends on the hardware components of the servers, such as CPUs, cooling fans, drive motors, etc. Sun has made significant progress in this area in providing energy-efficient solutions. Sun has backed OpenEco, an organization that helps participants assess, track, and compare energy performance, and has also published its own carbon footprint.

    3. Efficiency of the software architecture: To achieve this kind of efficiency, the software architecture is optimized to consume less energy while providing the same functionality. Optimization techniques have so far focused on performance, storage, and manageability, ignoring the architectural tuning that brings energy efficiency.

    Round robin is a popular load balancing algorithm for spreading load across servers, but it has proven to be energy-inefficient. Compression is another example. If data is stored compressed on disk, it takes CPU cycles to decompress it; if it is stored uncompressed, it takes more I/O calls. Everything else being equal, which approach requires less power? These are not trivial questions.
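    The round-robin point can be illustrated with a toy power model. Idle servers still draw a large fraction of their peak power, so spreading requests evenly keeps every machine awake at partial load, while packing the same requests onto fewer servers lets the rest drop into a low-power sleep state. All the wattage and capacity figures below are hypothetical, chosen only to show the shape of the trade-off:

```python
# Hypothetical figures: an idle server still draws most of its peak power.
SLEEP_W, IDLE_W, PEAK_W = 10.0, 150.0, 250.0
CAPACITY = 100    # requests/sec one server can handle
SERVERS = 4

def server_power(utilization: float) -> float:
    """Linear power model: idle floor plus a load-proportional component."""
    return IDLE_W + (PEAK_W - IDLE_W) * utilization

def round_robin_power(load: int) -> float:
    """Round robin spreads load evenly, so all servers stay awake."""
    return SERVERS * server_power(load / (SERVERS * CAPACITY))

def consolidated_power(load: int) -> float:
    """Pack load onto as few servers as possible; the rest sleep."""
    active = -(-load // CAPACITY)            # ceiling division
    total = (SERVERS - active) * SLEEP_W     # sleeping servers
    remaining = load
    for _ in range(active):
        chunk = min(CAPACITY, remaining)
        total += server_power(chunk / CAPACITY)
        remaining -= chunk
    return total

# 100 req/s across 4 servers: even spreading vs consolidation.
print(round_robin_power(100), consolidated_power(100))  # 700.0 280.0
```

    Under these (made-up) numbers, the same workload costs 700 W round-robin but 280 W consolidated: identical functionality, very different energy bill. This is exactly the kind of architectural tuning the third category is about.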

    I do not favor an approach that requires the majority of programmers to change their behavior and learn a new way of writing code. One way to optimize the energy performance of a software architecture is to apply an 80/20 rule: 80% of applications use the same 20% of the code, and in most cases that is infrastructure or middleware code. It is relatively easy to educate and train this small subset of programmers to optimize their code and architecture for energy efficiency. Virtualization could also help a lot here, since the execution layers can be abstracted into something that can be rapidly changed and tuned, without touching the underlying code, while providing consistent functionality and behavior.

    Energy efficiency cannot be achieved by tuning things in isolation; it requires a holistic approach. The PUE ratio identifies the energy lost before it reaches a server, an energy-efficient server requires less power to execute the same software than other servers do, and an energy-efficient software architecture actually lowers the energy consumed for the same functionality the software provides. We need to invest in all three categories.
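    One way to see why all three categories matter is that their effects multiply: facility overhead (PUE) scales the server's draw, and the software determines how long the server has to run. A back-of-envelope sketch with hypothetical numbers:

```python
# Facility energy for one workload, as a product of the three levers:
#   PUE (facility) x server draw (hardware) x runtime (software).
def facility_energy_kwh(pue: float, server_kw: float, hours: float) -> float:
    return pue * server_kw * hours

# Hypothetical baseline vs a deployment improved at every layer.
baseline = facility_energy_kwh(pue=2.0, server_kw=0.30, hours=10)  # 6.0 kWh
improved = facility_energy_kwh(pue=1.2, server_kw=0.25, hours=7)   # 2.1 kWh
print(baseline, improved)
```

    Modest gains at each layer compound into a nearly 3x reduction here, which no single category could deliver on its own.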

    Power consumption is just one aspect of being green. There are many other factors, such as how a data center handles e-waste, the building materials used, the greenhouse gases from the captive power plant (if any) and the cooling plants, etc. Still, tackling energy efficiency is a great first step in greening the data centers.