UGC NET NTA Exam Preparation: Data Interpretation | Unit-7 | Part-1

Data:

Information in raw or unorganised form (such as alphabets, numbers, or symbols) that refers to or represents conditions, ideas, or objects is called 'data'. For example, the marks of students in a class, the temperature of various places in the country, etc.

Data Acquisition: 

Data acquisition is the process of gathering and measuring information on targeted or desired variables, which enables one to evaluate outcomes and answer relevant questions. It is a component of research in all fields of study, including the physical and social sciences, the humanities, and business.

The goal of all data collection is to capture quality evidence that allows analysis leading to the formulation of convincing and credible answers to the questions that have been posed.

A formal data acquisition process is necessary because it ensures that the data gathered are both well defined and accurate, and that subsequent decisions based on arguments present in the findings are valid.

Sources of data:

Normally, we can gather data from two sources, namely primary and secondary.

1. Primary data source: Primary data is data that you gather yourself, keeping your analysis needs in mind. The source of primary data is the population from which you collect information, usually through a questionnaire targeted at a sample of a particular size and kind.

2. Secondary data source: Secondary data is data acquired from existing sources such as magazines, books, documents, journals, reports, the web, and more.

Classification of data:

1. Qualitative data:
This type of data is basically descriptive in nature and a bit difficult to analyse. It can be classified into categories on the basis of the physical attributes and properties of an object. It is concerned with data that is observable in terms of smell, appearance, taste, feel, texture, gender, etc. The methods of collecting qualitative data are
1. Focus group
2. Observation
3. Interview

2. Quantitative data:
This type of data basically deals with quantities or numbers. It refers to data whose values can be computed and expressed in numerical terms. It can be used in computations and statistical tests, and it is concerned with measurements such as height, weight, volume, length, size, etc. (a short sketch contrasting the two types of data appears after the list below). The methods of collecting quantitative data are
1. Surveys
2. Experiments
3. Observations and interviews
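
A minimal sketch in Python, with made-up records, shows how the two kinds of data are typically summarised: qualitative values are counted by category, while quantitative values are measured and computed.

  # Made-up records combining a qualitative field (gender) and a
  # quantitative field (height in cm); purely illustrative.
  from collections import Counter

  records = [
      {"gender": "F", "height_cm": 162},
      {"gender": "M", "height_cm": 175},
      {"gender": "F", "height_cm": 158},
  ]

  # Qualitative: count how many records fall into each category.
  print(Counter(r["gender"] for r in records))                 # Counter({'F': 2, 'M': 1})

  # Quantitative: compute a numerical summary such as the mean height.
  print(sum(r["height_cm"] for r in records) / len(records))   # 165.0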

Data mapping:

Data mapping is defined as the relationship between two data systems: it connects two different data models together. It is the process of finding out how one application or database connects to another application or database.

For example, suppose there are two different databases, one containing the list of names of persons in an organisation and the other containing the list of their phone numbers. If we wish to connect the two lists together, we take the help of data mapping applications or software, as in the sketch below.
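
A rough sketch in Python of how the name/phone-number example above might look, assuming both databases share an identifier field; the field name 'emp_id' and all values are purely illustrative.

  # Two separate lists, standing in for the two databases.
  names = [
      {"emp_id": 1, "name": "Asha"},
      {"emp_id": 2, "name": "Ravi"},
  ]
  phones = [
      {"emp_id": 1, "phone": "98xxxxxx01"},
      {"emp_id": 2, "phone": "98xxxxxx02"},
  ]

  # Build a lookup on the shared key, then attach each phone number to the
  # matching name record.
  phone_by_id = {row["emp_id"]: row["phone"] for row in phones}
  combined = [{"name": row["name"], "phone": phone_by_id.get(row["emp_id"])} for row in names]
  print(combined)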

Data mapping tasks:

All the tasks under data mapping fall into two categories. These are

1. Data Migration:

This process mainly focuses on moving information from one data model to another. It creates a path between the source data model and the destination data model.
This path is facilitated by data mapping software, which makes the migration process easy and efficient.
For example, if we want to move all the data containing contact numbers, text messages, and photos from one mobile phone to another, we use software or a feature like Bluetooth to get the job done. This is how data migration works.
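
A minimal sketch of a migration driven by a field mapping; the source and destination field names ('contact_no' to 'phone', etc.) are assumptions made only for this example.

  # Records in the source data model.
  source_records = [
      {"name": "Asha", "contact_no": "98xxxxxx01"},
      {"name": "Ravi", "contact_no": "98xxxxxx02"},
  ]

  # The mapping states which destination field each source field feeds into.
  field_map = {"name": "full_name", "contact_no": "phone"}

  # Rewrite every record in the shape the destination data model expects.
  destination_records = [
      {field_map[field]: value for field, value in record.items()}
      for record in source_records
  ]
  print(destination_records)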

2. Data Integration:

This process mainly focuses on new information. It creates a path between a new data model and an old data model and connects them to each other. By connecting both data models, it becomes easier to access data from both sources.
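
A simple sketch of integration: records from an old data model and a new one are merged on a shared key so that data from both sources can be accessed together. The keys and fields below are made up for illustration.

  # The old model knows names and cities; the new model adds e-mail addresses.
  old_model = {"C001": {"name": "Asha", "city": "Pune"}}
  new_model = {"C001": {"email": "asha@example.com"},
               "C002": {"email": "ravi@example.com"}}

  integrated = {}
  for key in set(old_model) | set(new_model):
      # Merge whatever each model knows about this key into one record.
      integrated[key] = {**old_model.get(key, {}), **new_model.get(key, {})}
  print(integrated)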

Data mapping software:

Data mapping software usually works as a converter: it converts data into an electronic data interchange (EDI) file format so that incoming data can be matched with existing data. Such software transforms information between systems efficiently. It can transform any field's data into the specific length, data type, or format that the receiving system needs; it also drops redundant or irrelevant data fields and combines elements into a new field. A small sketch of such a transformation appears after the list below.
Some of the data mapping software products are:
  • Adeptia
  • HVR
  • Alooma
  • IBM Infosphere
  • Dell Boomi
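
The sketch below illustrates the kind of field transformation described above; it is not the behaviour of any particular product listed, and the field names are invented for the example.

  # Incoming data: values are cast to the required type, trimmed to a fixed
  # length, an irrelevant field is dropped, and two elements are combined
  # into a new field.
  incoming = {"first": "Asha", "last": "Verma", "age": "29", "internal_flag": "x"}

  transformed = {
      "full_name": (incoming["first"] + " " + incoming["last"])[:20],  # combine + fixed length
      "age": int(incoming["age"]),                                     # required data type
      # 'internal_flag' is dropped as a redundant field
  }
  print(transformed)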

Data Governance:

Data governance is a four-way framework comprising availability, applicability, integrity, and security. It is a set of processes, used by stakeholders with the help of technology, to ensure that important and critical data is managed and protected.
It involves streamlined coordination of individuals, methods, and technology in such a way that it results in realising the value of data for an organisation. It has to be implemented as a disciplined workflow within the organisation; without that discipline, data would never be treated as a valuable commodity.

Importance of Data Governance:

Data governance is required to ensure that an organisation's information assets are formally, properly, proactively, and efficiently managed throughout the enterprise to secure trust and accountability. Data governance comprises collecting data, revising and standardising it, and making it fit for use; it makes data consistent.
It ensures that critical data is available at the right time to the right person, in a standardised and reliable form. Adopting and implementing data governance can result in improved productivity and efficiency for an organisation.

Tools of Data Governance:

It is becoming very important to monitor business data for hacking and to control it to meet regulatory standards. A quality data governance programme should include a governing body that defines procedures and creates an executable plan. Data governance tools help you handle your data and ensure it meets regulations.
Some of the data governance tools available in the market are as follows:
  • TAG INSPECTOR
  • AGILITY
  • ASIGRA
  • ACAVEO
  • A.K.A
  • CLEARSWIFT

Data interpretation:

It is the process of analysing and performing calculations on the given data to obtain a desired result or reach a conclusion. General methods used in data interpretation (DI) problems are data tables, pie charts, line charts, etc.
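
A tiny worked example with made-up numbers, in the style of a DI question: given a data table of a company's sales over three years, find the percentage growth between two of the years.

  # Year-wise sales (in lakh rupees); the figures are illustrative only.
  sales = {2020: 120, 2021: 150, 2022: 180}

  # Percentage growth from 2020 to 2022.
  growth = (sales[2022] - sales[2020]) / sales[2020] * 100
  print(f"Sales growth from 2020 to 2022: {growth:.1f}%")   # 50.0%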
