8 Reasons Why You Need to Undertake Technical and Application Assessments

Are your information assets enabling you to operate more cost-effectively, or are they drawing in more risks than you are actually aware of? To find out, you need a clearer picture of those assets: one that shows whether your IT investments are delivering the benefits you expected and where improvements should be made.

The best way to get the answers to those questions is through technical and application assessments. In this post, we'll identify 8 good reasons why it is now imperative to undertake such assessments.

1. Address known issues – Perhaps the most common reason that drives companies to undertake a technology/application assessment is to identify the causes of existing issues such as those related to data accessibility, hardware and software scalability, and performance.

2. Cut down liabilities and risks – Unless you know what and where the risks are, there is no way you can implement an appropriate risk mitigation strategy. A technology and application assessment will enable you to thoroughly test and examine your information systems to see where your business-critical areas and points of failure are and subsequently allow you to act on them.

3. Discover emerging risks – Some risks may not yet be as threatening as others, but it is certainly reassuring to know whether any exist. That way, you can either nip them in the bud or keep them under close watch.

4. Comply with regulations – Regulations like SOX require you to establish adequate internal controls to achieve compliance. Other regulations call for the protection of personally identifiable information. Assessments will help you pinpoint processes that lack controls, identify data that needs protection, and flag areas that fall short of regulatory requirements (see the sketch after this list). This will enable you to act accordingly and keep your company away from tedious, time-consuming and costly sanctions.

5. Enhance performance – Poor performance is not always caused by ageing hardware or an overloaded infrastructure. Sometimes the culprits are unsuitable configuration settings, inappropriate security policies, or misplaced business logic. A well-executed assessment can provide enough information to build a more cost-effective action plan and help you avoid an expensive but pointless purchase.

6. Improve interoperability – Disparate technologies working in complete isolation from each other may be preventing you from realising the maximum potential of your entire IT ecosystem. By examining your IT systems, you may discover ways to make them interoperate and, in turn, harness the untapped capabilities of existing assets.

7. Ensure alignment of IT with business goals – An important factor in achieving IT governance is the proper alignment of IT with business goals. IT processes need to be assessed regularly to ensure that this alignment continues to exist. If it does not, then necessary adjustments can be made.

8. Provide assurance to customers and investors – Escalating cases of data breaches and identity theft are making customers and investors more conscious of a company's ability to preserve the confidentiality of sensitive information. By conducting regular assessments, you can show your customers and investors the concrete steps you take to keep sensitive information confidential.
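As a taste of what the compliance point (reason 4) involves in practice, here is a minimal sketch, in Java, of an assessment-style scan that flags files which appear to contain personally identifiable information. The scanned directory, file extensions and patterns are illustrative assumptions rather than a compliance checklist.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.regex.Pattern;

/**
 * Minimal sketch of one assessment task: flagging files that appear to
 * contain personally identifiable information (PII). The patterns and the
 * scanned path are illustrative assumptions, not a compliance tool.
 */
public class PiiScan {

    // Illustrative patterns only: e-mail addresses and long digit runs
    // (e.g. account or card numbers). Real assessments use far richer rules.
    private static final List<Pattern> PII_PATTERNS = List.of(
            Pattern.compile("[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\\.[A-Za-z]{2,}"),
            Pattern.compile("\\b\\d{13,16}\\b"));

    public static void main(String[] args) throws Exception {
        Path root = Path.of(args.length > 0 ? args[0] : ".");
        try (var paths = Files.walk(root)) {
            paths.filter(Files::isRegularFile)
                 .filter(p -> p.toString().endsWith(".txt") || p.toString().endsWith(".csv"))
                 .forEach(PiiScan::scanFile);
        }
    }

    private static void scanFile(Path file) {
        try {
            String content = Files.readString(file);
            if (PII_PATTERNS.stream().anyMatch(p -> p.matcher(content).find())) {
                System.out.println("Possible PII found in: " + file);
            }
        } catch (Exception e) {
            System.err.println("Could not read " + file + ": " + e.getMessage());
        }
    }
}
```

A real assessment would pair such a scan with a review of who can access the flagged locations and which controls already apply to them.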

Check our similar posts

Are Master Data Management and Hadoop a Good Match?

Master Data is the critical electronic information about the company that we cannot afford to lose. Accordingly, we should sanitise it, look after it, and store it safely in several separate places that are independent of each other. The advent of Big Data introduced the current era of huge repositories 'in the clouds'. They are not literally in the clouds, of course, but at least they are remote. This short article discusses Hadoop and whether it is a good platform on which to back up your Master Data.

About Hadoop

Hadoop is an open-source Apache software framework built on the assumption that hardware failure is so common that backups are unavoidable. It comprises a storage layer and a management layer that distributes the data to smaller nodes, where it is processed faster and more efficiently. Prominent users include Yahoo! and Facebook; in fact, more than half of the Fortune 50 companies were using Hadoop in 2013.

Hadoop – which reached its 1.0 release in December 2011 – has survived its baptism of fire and become a respected, reliable option. But is this something the average business owner can tackle on their own? Bear in mind that open-source software generally comes with little implementation support from the vendor.

The Hadoop Strong Suit

  • Free to download, use and contribute to
  • Everything you need 'in the box' to get started
  • Distributed across multiple fire-walled computers
  • Fast processing of data held in efficient cluster nodes
  • Massive scalable storage you are unlikely to run out of

Practical Constraints

There is more to working with Hadoop than writing a WordPress post. The most straightforward routes are uploading data with Java commands, obtaining an interface mechanism, or using third-party vendor connectors such as ACCESS or SAS. The system does not replace the need for IT support, although it is cheap and exceptionally powerful.
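To illustrate the Java route, below is a minimal sketch that copies a Master Data extract into HDFS using the Hadoop FileSystem API. The NameNode address and the file paths are assumptions made for illustration.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Minimal sketch: copying a local Master Data extract into HDFS with the
 * Hadoop FileSystem API. The cluster address and paths are illustrative.
 */
public class MasterDataBackup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; replace with your cluster's fs.defaultFS.
        conf.set("fs.defaultFS", "hdfs://namenode.example.com:8020");

        try (FileSystem fs = FileSystem.get(conf)) {
            Path local = new Path("/data/exports/master-data.csv");         // local extract
            Path remote = new Path("/backups/master-data/master-data.csv"); // HDFS target
            fs.copyFromLocalFile(local, remote);
            System.out.println("Backup stored at " + fs.getUri() + remote);
        }
    }
}
```

Even this small example assumes a running cluster, network access to the NameNode, and the Hadoop client libraries on the classpath, which is exactly the kind of groundwork in-house IT support would normally provide.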

The Not-Free Safer Option

Smaller companies without in-depth in-house support are wise to engage a technical intermediary. Several companies provide commercial implementations followed by support: Microsoft, Amazon and Google, among others, all have commercial versions in their catalogues and support teams at the end of the line.

The Rights of Individuals Under The General Data Protection Regulation

The General Data Protection Regulation, or GDPR, is a European Union law reinforcing the rights of citizens concerning the confidentiality of their information, and confirming that they own it. We thought it would be interesting to examine the GDPR, effective 25 May 2018, from an Irish citizen's perspective. This article is a summary of information on the Data Protection Commissioner's website, but viewed through a businessperson's lens.

How the Office Defines Data Protection

The Office believes that organisations receiving personal details have a duty to keep them private and safe. This applies inter alia to information that individuals supply to government, financial institutions, insurance companies, medical providers, telecoms services, and lenders. It also applies to information provided when they open accounts.

This information may be on paper, on computers, or in video, voice, or photographic records. The true owners of this information, the individuals themselves, have the right:

  • To make sure that it is factually correct
  • To be assured that it is shared responsibly
  • To insist that those with access use it only for stated purposes

Any organisation requesting personal information must state who they are, what the information is for, why they need to have it, and to whom else they may provide it.

Consumer Rights to Access Their Personal Information

Private persons have a right under the GDPR to a copy of all their information held or processed by a business. Interestingly, the regulation refers to such businesses as 'data controllers' rather than owners. Controllers have to provide both paper and digital data, together with 'related information'.

Data controller fees for this are discretionary within limits, and the request may be denied under certain circumstances. The data controller may release information about children to parents and guardians, but only if it considers the minor too young to understand its significance. Other third parties, such as attorneys, must prove they have consent.

Consumer Rights to Port Their Data to Different Services

Since the personal information belongs to the individual, they have a right not only to access it, but also to copy or move it from one digital environment to another. The GDPR requires that this be done 'in a safe way, without hindrance to usability'. One application could be a banking customer who wants to upload their transaction history to a third-party price comparison website.

However, the right to data portability only applies to data originally provided by the consumer. Moreover, an automated method must be available for porting. Data controllers must release the information in an open format, and may not charge for the porting service.
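As a hedged sketch of what 'an open format' and 'an automated method' might look like in practice, the following Java fragment exports a customer's transaction history as CSV. The Transaction record and the sample data are illustrative assumptions, not a prescribed GDPR mechanism.

```java
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

/**
 * Minimal sketch of a data portability export: releasing a customer's
 * transaction history in an open, machine-readable format (CSV here).
 * The Transaction record and sample rows are illustrative assumptions.
 */
public class PortabilityExport {

    record Transaction(String date, String description, double amount) { }

    public static void main(String[] args) throws Exception {
        List<Transaction> history = List.of(
                new Transaction("2018-05-02", "Grocery store", -54.20),
                new Transaction("2018-05-03", "Salary", 2100.00));

        StringBuilder csv = new StringBuilder("date,description,amount\n");
        for (Transaction t : history) {
            csv.append(t.date()).append(',')
               .append(t.description()).append(',')
               .append(t.amount()).append('\n');
        }

        // Hand the file to the data subject (or their chosen provider) free of charge.
        Path out = Path.of("transaction-history.csv");
        Files.write(out, csv.toString().getBytes(StandardCharsets.UTF_8));
        System.out.println("Export written to " + out.toAbsolutePath());
    }
}
```

In a production system the same export would typically be exposed through an authenticated, automated endpoint rather than a manual file handover.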

Consumer Rights to Complain About Personal Data Abuse

Individuals have a right under the General Data Protection Regulation to have their information rectified if they discover errors. This right extends to an assurance that third parties are informed of the changes – and to knowing who those third parties are. Data controllers must respond within one month. If they decline the request, they must inform the complainant of their right to further remedial action.

If a data controller refuses to release personal information to the owner, or to correct errors, then the Data Protection Office has legal power to enforce the consumer?s rights. The complainant must make full disclosure of the history of their complaint, and the steps they have taken themselves to attempt to set things right.

Further Advice on Getting Things Ready for 25 May 2018

The General Data Protection Regulation has the full force of law from 25 May 2018 onward, and supersedes all applicable Irish laws, regulations, and policies from that date. We recommend incorporating the rights of data owners, who are also your customers, into your immediate plans; forgetting to do so is unlikely to cut much ice with the Data Commissioner. Remember, you have one month to respond to consumer requests, with only a limited extension available where the matter is complex.
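As a small worked example of that response window, the sketch below uses java.time to compute the one-month deadline from the date a request is received; the dates are illustrative, and any extension for a complex case would need to be recorded and justified separately.

```java
import java.time.LocalDate;

/**
 * Minimal sketch of tracking the GDPR response window: one month from
 * receipt of a request. Dates shown are illustrative.
 */
public class RequestDeadline {
    public static void main(String[] args) {
        LocalDate received = LocalDate.of(2018, 5, 28);   // date the request arrived
        LocalDate respondBy = received.plusMonths(1);     // statutory one-month window
        System.out.println("Request received:      " + received);
        System.out.println("Respond no later than: " + respondBy);
    }
}
```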

How DevOps oils the Value Chain

DevOps – a clipped compound of 'development' and 'operations' – is a way of working whereby software developers are in a team with the project beneficiaries. This client-centred approach extends the project plan to include the life cycle of the product or service for which the software is developed.

We can then no longer speak of a software project for, say, Joe's Accounting App, because the software has no intrinsic value of its own. It follows that the software engineers are building an accounting app product. This is a small but crucially important distinction, because they are no longer in a silo with different business interests.

To take the analogy further, the developers are no longer contractors possibly trying to stretch out the process. They are members of Joe's accounting company, and they are just as keen to get to market fast as Joe is to start earning income. DevOps uses this synergy to achieve the overarching business goal.

A Brief Introduction to DevOps

You can skip this section if you have already read this article. If not, then you need to know that DevOps is a culture, not just a working method. The three 'members' are the software developers, the beneficiaries, and a quality control mechanism. The developers break their task into smaller chunks instead of releasing the code to quality control as a single batch. As a result, the review process happens continuously, along these simplified lines:

Iteration 1: Code → QC → Test
Iteration 2:         Code → QC → Test
Iteration 3:                 Code → QC → Test
Iteration 4:                         Code → QC → Test
(Key: Code = developers, QC = quality control, Test = beneficiary)

This is a marked improvement over the previously cumbersome method below.

Write the Code → Test the Code → Use the Code → Evaluate, Schedule for Next Review → (repeat)

Working quickly and releasing smaller amounts of code means the DevOps team learns quickly from mistakes, and should reach product release ahead of any competitor using the older, more linear method. The shared way of working frees up huge resources in terms of user experience and in-line QC practices. Instead of being in a silo working on its own, development finds it has a richer brief and more support from being 'on the same side of the organisation'.

The Key Role that Application Program Interfaces Play

Application Program Interfaces, or APIs for short, are building blocks for software applications. Using proprietary software bridges speeds this process up. A good example is the PayPal integration that we find on so many websites today. APIs are not just for commercial sites; they can reduce costs and improve efficiency considerably.

The following diagram, courtesy of TIBCO, illustrates how second-party applications integrate with the PayPal architecture via an API façade.

[Diagram: API façade integration with PayPal – imgd2.jpg]
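As a hedged sketch of the kind of integration the diagram depicts, the following Java fragment calls a payment provider's REST endpoint over HTTPS using the standard java.net.http client. The endpoint URL, access token and JSON payload are hypothetical placeholders, not PayPal's actual API.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

/**
 * Minimal sketch of a website calling a payment provider's REST API.
 * The endpoint, token and payload are hypothetical placeholders.
 */
public class PaymentApiCall {
    public static void main(String[] args) throws Exception {
        String payload = "{\"amount\": \"25.00\", \"currency\": \"EUR\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.payments.example.com/v1/charges")) // hypothetical endpoint
                .header("Authorization", "Bearer <access-token>")               // placeholder credential
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(payload))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The caller relies only on the documented contract: status code and body.
        System.out.println(response.statusCode() + ": " + response.body());
    }
}
```

The point of the façade is that the website never touches the provider's internals; it depends only on this published contract.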

The DevOps Revolution Continues…

We close with some important insights from an interview with Jim Stoneham, who was general manager of the Yahoo Communities business unit at the time Flickr became part of it. 'Flickr was a codebase,' Jim recalls, 'that evolved to operate at high scale over 7 years – and continuing to scale while adding and refining features was no small challenge. During this transition, it was a huge advantage that there was such an integrated dev and ops team.'

The 'maturity model', as engineers currently refer to DevOps status, enables developers to learn faster and deploy upgrades ahead of their competitors. This means the client reaches and exceeds break-even sooner. DevOps lubricates the value chain so companies add value to a product faster. One reason it worked so well at Flickr was the immense trust between Dev and Ops, and that is a lesson we should learn.

'We transformed from a team of employees to a team of owners. When you move at that speed, and are looking at the numbers and the results daily, your investment level radically changes. This just can't happen in teams that release quarterly, and it's difficult even with monthly cycles.' (Jim Stoneham)

Contact Us

  • (+353)(0)1-443-3807 – IRL
  • (+44)(0)20-7193-9751 – UK
