How DevOps oils the Value Chain

DevOps, a clipped compound of "development" and "operations", is a way of working in which software developers form a single team with the project beneficiaries. This client-centred approach extends the project plan to cover the whole life cycle of the product or service for which the software is being developed.

We can then no longer speak of a software project for, say, Joe's Accounting App. The software has no intrinsic value of its own. It follows that the software engineers are really building an accounting product. This is a small but crucially important distinction, because they are no longer working in a silo with separate business interests.

To take the analogy further, the developers are no longer contractors who might be tempted to stretch out the process. They are members of Joe's accounting company, and they are just as keen to get to market quickly as Joe is to start earning income. DevOps uses this synergy to achieve the overarching business goal.

A Brief Introduction to DevOps

You can skip this section if you are already familiar with DevOps. If not, you need to know that DevOps is a culture rather than a rigid working method. The three "members" are the software developers, the beneficiaries, and a quality control mechanism. The developers break their task into smaller chunks instead of releasing the code to quality control as a single batch. As a result, the review process happens continuously, along the simplified lines below.

Iteration 1:  Code   QC   Test
Iteration 2:         Code   QC   Test
Iteration 3:                Code   QC   Test
Iteration 4:                       Code   QC   Test

Key: Code = Developers | QC = Quality Control | Test = Beneficiary

This is a marked improvement over the previously cumbersome method below.

Write the Code → Test the Code → Use the Code → Evaluate, Schedule for Next Review → (back to Write the Code)

Working quickly and releasing smaller amounts of code means the DevOps team learns quickly from its mistakes, and should reach product release ahead of any competitor using the older, more linear method. The shared way of working frees up huge resources in terms of user experience and in-line QC practices. Instead of being in a silo working on its own, development finds it has a richer brief and more support from being "on the same side of the organisation".
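
To make the contrast concrete, here is a minimal, illustrative Python sketch, not taken from any real pipeline, that models the two release styles shown above: one large batch passing through code, QC and test in sequence, versus small increments flowing through the same stages in an overlapping stream. The stage durations are made-up numbers, chosen only to show the shape of the arithmetic.

# Illustrative only: a toy model of batch vs. incremental delivery.
# Stage durations are invented figures, not benchmarks.

STAGES = ["code", "qc", "test"]          # developers -> quality control -> beneficiary
STAGE_DAYS = {"code": 5, "qc": 2, "test": 1}

def batch_release(features: int) -> int:
    """All features move through each stage together; feedback arrives only at the end."""
    return sum(STAGE_DAYS[s] * features for s in STAGES)

def incremental_release(features: int) -> int:
    """Features are released one at a time; stages overlap like the staggered table above.
    The first feature takes one full pass; later features pipeline behind it,
    gated by the slowest stage."""
    first_pass = sum(STAGE_DAYS.values())
    bottleneck = max(STAGE_DAYS.values())
    return first_pass + bottleneck * (features - 1)

if __name__ == "__main__":
    n = 4
    print(f"Batch: first feedback after {batch_release(n)} days")
    print(f"Incremental: first feedback after {sum(STAGE_DAYS.values())} days, "
          f"all {n} features delivered after {incremental_release(n)} days")

With these toy figures the incremental team hears from quality control after eight days instead of thirty-two, which is the whole point of the staggered flow.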

The Key Role that Application Programming Interfaces Play

Application Programming Interfaces, or APIs for short, are building blocks for software applications. Using these proprietary software bridges speeds the development process up. A good example is the PayPal integration we find on so many websites today. APIs are not just for commercial sites; they can cut costs and improve efficiency considerably.

The following diagram, courtesy of TIBCO, illustrates how third-party applications integrate with the PayPal architecture via an API façade.


[Diagram: third-party applications integrating with PayPal via an API façade (source: TIBCO)]
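
To show what calling a payments provider through an API façade looks like in practice, here is a short, hypothetical Python sketch. The endpoint URL, field names, token and response shape are all invented for illustration; this is not PayPal's real API, only the general pattern of delegating payment logic to a provider instead of building it in-house.

# Hypothetical example: the endpoint, fields and token below are invented
# to illustrate calling a payments provider through an API facade.
import json
import urllib.request

FACADE_URL = "https://api.example-payments.com/v1/charges"   # placeholder, not a real service
API_TOKEN = "replace-with-your-token"

def create_charge(amount_cents: int, currency: str, reference: str) -> dict:
    """Send a charge request to the (hypothetical) payments facade and return its JSON reply."""
    payload = json.dumps({
        "amount": amount_cents,
        "currency": currency,
        "reference": reference,
    }).encode("utf-8")
    request = urllib.request.Request(
        FACADE_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Usage (would fail against the placeholder URL; shown only for the shape of the call):
# receipt = create_charge(1999, "EUR", "invoice-1042")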

The DevOps Revolution Continues…

We close with some important insights from an interview with Jim Stoneham, who was general manager of the Yahoo Communities business unit at the time Flickr became part of it. "Flickr was a codebase," Jim recalls, "that evolved to operate at high scale over seven years, and continuing to scale while adding and refining features was no small challenge. During this transition, it was a huge advantage that there was such an integrated dev and ops team."

The "maturity model", as engineers currently refer to DevOps status, enables developers to learn faster and deploy upgrades ahead of their competitors. This means the client reaches and exceeds break-even sooner. DevOps lubricates the value chain so that companies add value to a product faster. One reason it worked so well at Flickr was the immense trust between Dev and Ops, and that is a lesson we should all learn.

"We transformed from a team of employees to a team of owners. When you move at that speed, and are looking at the numbers and the results daily, your investment level radically changes. This just can't happen in teams that release quarterly, and it's difficult even with monthly cycles." (Jim Stoneham)

Contact Us

  • (+353)(0)1-443-3807 – IRL
  • (+44)(0)20-7193-9751 – UK

Check our similar posts

Telemetry and the Survival of the Human Species

Without moisture, plants die. Without fodder, the animal food chain collapses. This is why climate change is the greatest threat humankind faces. Crop management needs timely information about ambient conditions, and about conditions in the soil itself. In dry areas, online knowledge of trends in rainfall, sunlight, wind speed, leaf moisture, air temperature, relative humidity and solar radiation points to the kind of soil stress that can be deadly for plants, and for everything that relies on them.

As climate change bites, the need to find solutions accelerates. Drones swoop across fields to monitor ambient conditions, while probes sunk into plants and into the earth in which they grow transmit information to big data repositories, which feed reports back to administrators. In Australia, a remarkable cattle farmer is applying the same approach to his herds.

Nuffield scholar Rob Cook has always been on the edgy side of things. He lost his mobility in a helicopter crash in 2008 while patrolling farmland, but that has not deterred him. If anything, it has freed his mind to explore the potential that telemetry offers farmers in Australia. He recently shared this potential with young beef producers in Roma, Australia, and here is a summary of what he said.

Being wheelchair-bound, he had to shift from herding with cattle dogs to a more scientific approach. He bought a farm 230 miles (370 kilometres) inland from Brisbane, in a warm, temperate climate with significant rainfall even in the driest months. He uses monitoring software that reports on critical issues such as water levels, which indicate animal consumption, and supplementary water flows from a central irrigation channel.

He also monitors fodder sources for the drier months, and moisture levels in food stocks. Rob is committed to making every blade of grass count. "We even have the ability to take a photo of the cattle when they are taking a drink of water," he explains, and that provides valuable information about tick and fly infestation and overall condition.

None of this would be possible for Rob Cook without telemetry, the process of collecting data at remote points and transmitting it to receiving equipment for analysis. Independent farmers cannot fund these analytic resources on their own, and instead use big data resources in the cloud to obtain reports. ecoVaro is on top of current trends. Please speak to us when you need independent advice.
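
As a rough illustration of the telemetry loop described above, the Python sketch below samples a handful of field metrics and posts them to a remote repository for analysis. The sensor values, field names and ingest endpoint are all invented; this is not any particular vendor's product, only the general collect-and-transmit pattern.

# Illustrative telemetry loop: readings and endpoint are invented for this sketch.
import json
import random
import time
import urllib.request

INGEST_URL = "https://ingest.example-farm-cloud.com/readings"   # placeholder endpoint

def read_sensors() -> dict:
    """Stand-in for real probes: returns made-up values for the metrics the article lists."""
    return {
        "timestamp": time.time(),
        "rainfall_mm": round(random.uniform(0, 5), 2),
        "air_temp_c": round(random.uniform(15, 35), 1),
        "relative_humidity_pct": round(random.uniform(20, 80), 1),
        "leaf_moisture_pct": round(random.uniform(10, 60), 1),
        "water_trough_level_pct": round(random.uniform(30, 100), 1),
    }

def transmit(reading: dict) -> None:
    """Send one reading to the remote repository for later analysis."""
    body = json.dumps(reading).encode("utf-8")
    req = urllib.request.Request(INGEST_URL, data=body,
                                 headers={"Content-Type": "application/json"},
                                 method="POST")
    urllib.request.urlopen(req)   # would fail against the placeholder URL

if __name__ == "__main__":
    for _ in range(3):            # in the field this would run continuously
        transmit(read_sensors())
        time.sleep(60)            # sample once a minute in this sketch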


Knowing the Caveats in Cloud Computing

Cloud computing has become such a buzzword in business circles today that many organisations, both small and large, are quick to jump on the cloud bandwagon, sometimes a little too hastily.

Yes, the benefits of the cloud are numerous: reduced infrastructure costs, improved performance, faster time-to-market, the capacity to develop more applications, lower IT staff expenses; you get the picture. But contrary to what many may be expecting or have been led to believe, cloud computing is not without its share of drawbacks, especially for smaller organisations with limited in-house expertise.

So before businesses move to the cloud, it pays to learn a little more about the caveats that could meet them along the way. Here are some tips to getting started with cloud computing as a small business consumer.

Know your cloud. As with anything else, knowledge is always key. Because cloud computing is a relatively new tool in IT, it is not surprising that there is some confusion about the term among many business owners and even CIOs. According to The NIST Definition of Cloud Computing, cloud computing has five essential characteristics, three basic service models (SaaS, PaaS and IaaS), and four deployment models (public, community, private and hybrid).
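
The NIST breakdown is easier to keep straight as a simple reference structure. The short Python snippet below records the taxonomy from the NIST definition cited above; how an organisation uses it (for example, to record its own chosen models) is up to the reader.

# The NIST Definition of Cloud Computing (SP 800-145), as a quick reference structure.
NIST_CLOUD_MODEL = {
    "essential_characteristics": [
        "On-demand self-service",
        "Broad network access",
        "Resource pooling",
        "Rapid elasticity",
        "Measured service",
    ],
    "service_models": ["SaaS", "PaaS", "IaaS"],
    "deployment_models": ["Public", "Community", "Private", "Hybrid"],
}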

The first thing organisations should do is review their operations and evaluate whether they really need a cloud service. If they would indeed benefit from cloud computing, the next steps are deciding on the service model that best fits the organisation and choosing the right cloud service provider. These factors are particularly important when you consider data security and compliance issues.

Read the fine print. Before entering into a contract with a cloud provider, businesses should first ensure that the responsibilities of both parties are well defined, and that the cloud vendor has the vital contingency mechanisms in place. For instance, how does the provider intend to carry out backup and data retrieval operations? Is there assurance that the business's critical data and systems will be accessible at all times? And if not, how soon can the data be made available in case of a temporary shutdown of the cloud?

Also, what if either the company or the cloud provider stops operations or goes bankrupt? It should be clear from the get-go that the data remains the sole property of the consumer or company subscribing to the cloud.

As you can see, there are various concerns that need to be addressed closely before any agreement is finalised. While these details are usually found in the Service Level Agreements (SLAs) of most outsourcing and servicing contracts, unfortunately, the same cannot be said of cloud contracts.

Be aware of possible unforeseen costs. The ability of smaller companies to access computing resources on a scalable, pay-as-you-go model is one of the biggest selling points of cloud computing. But there is also an inherent risk here: the possibility of runaway costs. Rather than enjoying significant cost savings, small businesses could end up with a bill that is bound to blow a big hole in their budget.

Take, for example, the case of a software company cited on InformationWeek.com to illustrate this point. The 250-server cluster the company rented from a cloud provider was inadvertently left running by the testing team over the weekend. As a result, their usual $2,300 bill ballooned to a whopping $23,400 over the course of one weekend.
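
A rough back-of-the-envelope Python sketch shows how quickly an idle cluster accumulates charges. The hourly rate and weekend duration below are assumptions chosen for illustration, not the actual figures from the InformationWeek case; the point is only the order of magnitude.

# Back-of-the-envelope cloud cost check; the hourly rate is an assumed figure.
SERVERS = 250
HOURLY_RATE_PER_SERVER = 1.40   # assumed $/server-hour, purely illustrative
WEEKEND_HOURS = 60              # Friday evening to Monday morning, roughly

idle_cost = SERVERS * HOURLY_RATE_PER_SERVER * WEEKEND_HOURS
print(f"Cluster left running over the weekend: ~${idle_cost:,.0f}")
# Roughly $21,000 of unplanned spend, which is the order of magnitude of the
# jump from $2,300 to $23,400 described above.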

Of course, in all likelihood this isn't going to happen to every small and midsize enterprise that shifts to the cloud. However, it should alert business owners, finance executives, and CEOs to look beyond the perceived savings and identify potential sources of unexpected costs. What may start as a fixed-rate scheme for on-demand computing resources may end up becoming a complex pricing puzzle as the needs of the business grow, or simply because of human error, as the example above shows.

The caveats we've listed here are among the most crucial ones that soon-to-be cloud adopters need to keep in mind. But should they be reason enough for businesses to stop pursuing a cloud strategy? Most definitely not. Armed with the right information, cloud computing is still the fastest and most effective way for many small enterprises to get their business off the ground with the lowest start-up costs.

ISO in Energy Management

Every industry has its own accepted or desirable levels of quality. Energy performance, like any other field, is governed by set standards. These differ across regions, but international standards do exist.

ISO 50001 is the international energy standard, applicable to both large and small organisations irrespective of geographical, cultural or social conditions. It outlines best energy management practice by specifying that an organisation must integrate an energy management system and institute an energy policy, objectives, targets and action plans that take into account legal requirements and information related to significant energy use.

What’s the importance of attaining energy certification?

ISO certification in any industry is a demonstration of quality, or that a service or product meets the expected standards. In energy management, ISO certification demonstrates that an organisation or company has implemented sustainable energy management systems, completed a baseline of its energy use and is committed to continuously improving its energy performance. In addition, ISO certification assists organisations in the following ways:

  • Enables organisations to optimise their existing energy-consuming assets
  • Offers guidance on benchmarking, measuring, documenting and reporting energy intensity improvements and their projected impact on reducing GHG emissions
  • Creates transparency and facilitates communication on the management of energy resources
  • Promotes energy management best practices and reinforces good energy management behaviours
  • Assists facilities in evaluating and prioritising the implementation of new energy-efficient technologies
  • Provides a framework for promoting energy efficiency throughout the supply chain
  • Facilitates energy management improvements in the context of GHG emission reduction projects: reducing carbon emissions enables an organisation to meet government carbon reduction targets and demonstrate its environmental credentials. The accruing benefits are many, ranging from increased investor confidence to more tender opportunities

Energy management software plays a vital role in helping organisations comply with energy standards through improved performance across an organisation's various functions.
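
To illustrate the kind of reporting such software automates, here is a minimal Python sketch comparing a reporting period against a baseline using energy intensity (energy used per unit of output), in the spirit of the benchmarking point above. All figures are invented, and energy intensity is only one of many possible metrics.

# Illustrative energy-intensity check against a baseline; all figures are invented.
def energy_intensity(kwh: float, units_produced: float) -> float:
    """Energy used per unit of output (kWh per unit)."""
    return kwh / units_produced

baseline = energy_intensity(kwh=120_000, units_produced=40_000)   # baseline period
current = energy_intensity(kwh=110_000, units_produced=44_000)    # reporting period

improvement_pct = (baseline - current) / baseline * 100
print(f"Baseline: {baseline:.2f} kWh/unit, current: {current:.2f} kWh/unit, "
      f"improvement: {improvement_pct:.1f}%")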
