Wednesday 12 November 2014


Big Data is the term used for the huge quantities of data now being collected and for the methods used to analyse and make sense of it all.  The problem came to prominence when the Sloan Digital Sky Survey began its redshift study in 2000 and started mapping a large swathe of the sky around us.  Within a few weeks it had acquired more information on the objects it observed than had ever been collected in the history of astronomy, and the team running it was at a loss as to how to deal with it. The Large Hadron Collider at CERN in Switzerland has some 150 million sensors delivering data 40 million times every second in its attempts to find new and exotic sub-atomic particles and to understand how matter is made up.  Both examples illustrate how the world of data collection and analysis has had to adapt to accommodate increasingly large amounts of information, and the challenges that this represents.

With sizeable data sets now commonplace in science, medicine, defence and manufacturing, industries have to change how they collect and analyse the data they are gathering. This is a factor receiving increasing attention in the oil and gas industry, where exploration, whether on land or undersea, may require the collection and sifting of huge amounts of data in order to pinpoint potential new resources.  Big Data can be characterised by a number of specific features:

Volume – The quantity of data produced, i.e. the overall size of the collected information.  It is estimated that US consumer giant Walmart deals with over a million customer transactions every hour and runs stock and consumer databases of around two and a half petabytes, or 2,560 terabytes, which is a pretty big volume to keep on top of.
Variety – The range of types of information received, which may include database entries, chemical data, statistical information or physical aspects such as size, position and movement. The greater the variety, the more analysis is needed to reconcile the information and understand it.
Velocity – The speed at which the data is amassed.  Some data sets may be large but gathered over many years, while others arrive as a huge mass of data that then requires analysis.  The Large Synoptic Survey Telescope is expected to amass 140 terabytes of information every five days, all of which will need to be sifted and categorised (a rough feel for what that rate means is sketched just after this list).
Veracity – The quality of the obtained information.  A geological survey may capture many terabytes of data on the make-up of rock formations and gas analyses, but if it is of insufficient quality there is little point in analysing it. Data needs to be of the best quality, which unfortunately tends to increase its size.
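
To put the velocity and volume figures above into more familiar units, the short back-of-the-envelope sketch below converts them into per-second rates. It is only an illustration based on the numbers quoted in this post; the decimal definition of a terabyte and the Python snippet itself are our own assumptions, not anything published by the surveys or companies mentioned.

TB = 10**12  # decimal terabyte (an assumption; the figures above do not specify)

# LSST: roughly 140 TB gathered every five days (figure quoted above)
lsst_bytes = 140 * TB
lsst_seconds = 5 * 24 * 60 * 60  # five days expressed in seconds
lsst_rate_mb_s = lsst_bytes / lsst_seconds / 10**6
print(f"LSST: about {lsst_rate_mb_s:.0f} MB of new data arriving every second, sustained")

# Walmart: over a million customer transactions every hour (figure quoted above)
transactions_per_hour = 1_000_000
print(f"Walmart: about {transactions_per_hour / 3600:.0f} transactions every second")

Run as written, this prints roughly 324 MB per second for the telescope and about 278 transactions per second for Walmart, which is one way of making "velocity" tangible: the data never stops arriving while you are still analysing what came in a moment ago.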

Added to these factors is another, complexity, which reflects how intensive data management and analysis can become, especially when large volumes of data arrive from multiple sources at high velocity. Big Data analysis is now being used in an increasing number of fields as experts try to make sense of data coming in from many different sources.  Knowing something is one thing, but actually understanding what the data is telling you is the key to getting the most from it, and to helping us advance in many areas.

Written exclusively for Enigma Consulting Group

Check out more of what we can do for you at www.enigma-cg.com

Thursday 6 November 2014



We are finally in the news! MEDIU News, also known as the Al-Madinah International University News Center, covered the GST training we organised at the Corus Hotel, Kuala Lumpur. The training ran from 27 to 30 October 2014 and was a great success, drawing many delegates eager to learn the details of the GST system that comes into effect in 2015. Among them were delegates from Al-Madinah International University, who gave the event great coverage in a piece posted on their news site recently. The article is quoted below; to all our readers, happy reading:


Al-Madinah International University (MEDIU) participated in the HRDF training programme ‘Goods & Services Tax (GST)’ on Wednesday-Thursday, 29-30 October 2014 at the Corus Hotel, Kuala Lumpur. The training session, entitled ‘GST for Accounting Treatment & Effective Implementation Process’, was attended by MEDIU’s Finance Division: Puan Raja Roslawati Raja Ismail, Chief Finance Officer, and Puan Zuraidah Othman, Head of Finance Department. 
The programme was organised by ENIGMA Consulting Group, and the speaker was Madam Tiffany Tang, a GST trainer for the Malaysian Institute of Accountants (MIA) with more than 15 years’ experience in a wide range of finance-industry roles and in GST/VAT practice across several Asian countries. The objectives of the two-day training programme were to understand and practise the various GST treatments for input and output tax, to set up the GST ledger, sub-ledger and GST coding, and to walk through the GST implementation process, as well as the documentation of process flowcharts for audit-trail and assessment purposes. 
The training course equipped the participants across the various areas of GST, namely GST registration, the pre-implementation assignment and action plan, observation of the actual implementation, tax invoices and record keeping, and audit and assessment compliance.

Source:  http://news.mediu.edu.my/2014/10/training-good-service-tax/