Our Clientele


Data Integration

NAPIC, the National Property Information Centre, operates under the ambit of the Director General of Valuation and Property Services.

NAPIC's missions are:

  • To supply up-to-date, comprehensive and quality property data

  • To publish products that are timely and relevant to the needs of the property industry

  • To provide a true assessment of the supply and demand of property in the country

NAPIC's role as the national centre for property information requires it to be responsive to the needs of the Government as well as other stakeholders in the property market. NAPIC needs to provide a faster, more timely information turnaround to meet the increasing demand for pro-active responses and for accurate, timely information on the supply of and demand for property.

PRISM-JPPH has been identified as the system that will enable NAPIC to achieve its objective: to collect more relevant, accurate, timely and comprehensive data that is systematically structured and analysed, and to improve the method and speed of producing information, creating uniformity in the interpretation, classification and presentation of data. The scope of work for the PRISM-JPPH enhancement project includes:

  1. PRISM enhancements for PIMS, DMPS, ECMS, EIRS, BIAS and PDWS modules.
  2. Malaysia House Price Index Rebase (MHPI-Rebase)
  3. Purpose Built Office Rental Index (PBORI)
  4. The development and implementation of the MICE (Matching, Imputation, Cleansing and Editing) module, with the scope below:


Integration of data exchange

In addition to operating PRISM to collect data, NAPIC also depends on other agencies and external data providers such as Tenaga Nasional Berhad (TNB), Lembaga Hasil Dalam Negeri (LHDN), Kementerian Perumahan dan Kerajaan Tempatan (KPKT) and Pihak Berkuasa Tempatan (PBT) to provide data. The capability to exchange data electronically, using file exchange via batch processing, opens up opportunities for better data management.
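
A minimal sketch of what the receiving end of such a batch file exchange could look like; the drop directory, file naming convention and processing here are assumptions for illustration, not the actual PRISM-JPPH interface:

    import csv
    from pathlib import Path

    # Hypothetical drop directory where external providers (TNB, LHDN, KPKT, PBT)
    # deposit their batch files; the <provider>_<date>.csv naming is assumed.
    INBOX = Path("/data/exchange/inbox")

    def ingest_batch_files():
        """Load every pending provider file, then mark it as processed."""
        for path in sorted(INBOX.glob("*.csv")):
            provider = path.stem.split("_")[0]          # e.g. "tnb" from tnb_20240101.csv
            with path.open(newline="", encoding="utf-8") as fh:
                rows = list(csv.DictReader(fh))
            print(f"{provider}: received {len(rows)} records from {path.name}")
            path.rename(path.with_suffix(".done"))      # keep processed files out of the next run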

Data Quality Management
Determination and management of the dimensions of data quality, based on standardization and metadata definition. This includes the data quality programme processes, a review of the data architecture, and the steps and rules required for matching, cleansing and imputation.
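
Purely as an illustration of the idea (the actual rules live in the SAS tooling, and the field names below are invented), quality dimensions can be expressed as named rule functions applied to each incoming record:

    # Illustrative data quality rules; the field names are hypothetical, not PRISM's schema.
    QUALITY_RULES = {
        "completeness": lambda r: bool(r.get("property_id") and r.get("address")),
        "validity":     lambda r: r.get("price", 0) > 0,
        "conformity":   lambda r: r.get("state", "").upper() in {"JOHOR", "SELANGOR", "PENANG"},  # sample subset
    }

    def assess(record: dict) -> dict:
        """Return a pass/fail result per quality dimension for one record."""
        return {name: rule(record) for name, rule in QUALITY_RULES.items()}

    print(assess({"property_id": "P001", "address": "Jalan Ampang",
                  "price": 450000, "state": "Selangor"}))
    # -> {'completeness': True, 'validity': True, 'conformity': True}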

Data matching, cleansing and editing
Using SAS Data Management tools and scripts, the data is matched, enriched and edited based on the rules defined in the data quality management scope.
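
The matching itself is performed in SAS; as a rough sketch of the concept, a Python equivalent might pair an incoming record with an existing one by simple string similarity (the threshold and the use of the address field are assumptions):

    from difflib import SequenceMatcher

    def similarity(a: str, b: str) -> float:
        """Crude string similarity in [0, 1]."""
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    def match_record(incoming: dict, existing: list[dict], threshold: float = 0.85):
        """Return the best candidate whose address is similar enough, else None."""
        best = max(existing,
                   key=lambda e: similarity(incoming["address"], e["address"]),
                   default=None)
        if best and similarity(incoming["address"], best["address"]) >= threshold:
            return best
        return None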

Master/Reference Data Management of property information
A central master data and reference management system provides standardization: for example, data received through the interfaces to the external data providers can be mapped onto the master data, which is managed centrally.
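
A toy illustration of the central reference idea, with invented codes and mappings: provider-specific codes are translated into a single master code list on receipt, so every module speaks the same vocabulary.

    # Hypothetical master reference list for property types, centrally managed.
    MASTER_PROPERTY_TYPES = {"RES": "Residential", "COM": "Commercial", "IND": "Industrial"}

    # Per-provider aliases mapped onto the master codes (sample values only).
    PROVIDER_ALIASES = {
        "kpkt": {"RUMAH": "RES", "KEDAI": "COM"},
        "pbt":  {"R": "RES", "C": "COM", "I": "IND"},
    }

    def to_master_code(provider: str, raw_code: str) -> str:
        """Translate a provider-specific code into the shared master code."""
        code = PROVIDER_ALIASES.get(provider, {}).get(raw_code.upper())
        if code not in MASTER_PROPERTY_TYPES:
            raise ValueError(f"Unmapped {provider} code: {raw_code!r}")
        return code

    print(to_master_code("pbt", "r"))  # -> RES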

 


 

Value Proposition:

A data governance setup in which a single unit is proposed to be set up to concentrate on data and information.

Formalized data stewardship allows for improved oversight of a data subject area (e.g. property or projects), with the following benefits:

  • Provides a central point of contact (a coordinating data steward), improving collaboration and business/IT alignment
  • Identifies opportunities for reuse (data, metadata, systems or data modelling)
  • Improves accountability for the data subject area
  • Improves data quality through ongoing monitoring in the subject area
  • Improves standardization in the naming and definition of data resources

Data quality management improvement: data from internal or external sources is measured and assessed, and recommendations for correction are made. Data quality can be continually and proactively monitored using the SAS data management tools to reduce the instances of reactive and costly responses. The emphasis is on three dimensions:

  • Accuracy – Users need a certain degree of exactness and precision, especially policy makers and investors, who normally place a high premium on accuracy.
  • Consistency – Consistency of data over time is required for research and forecasting purposes. Realignment of data caused by delayed submission, changes of usage or operational errors by data providers/data collectors will yield more accurate data.
  • Coherence – The degree to which statistical data series derived from a single statistical survey/programme are logically consistent with related statistics, both over time and conceptually.

The emphasis of data integration is on improving the speed, efficiency and quality of gathering data from the external data providers. The SAS data integration tool uses a metadata layer to hide the complexities from the end user, allowing a single query to span multiple data sources simultaneously and making data integration and data sharing easy.
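
The SAS metadata layer is proprietary; the sketch below merely illustrates the concept with an invented catalogue that maps logical dataset names onto physical sources, so the caller never needs to know where the data lives:

    import csv, sqlite3
    from pathlib import Path

    # Invented metadata catalogue: logical dataset name -> physical source details.
    CATALOG = {
        "transactions": {"kind": "sqlite", "path": "prism.db", "table": "transactions"},
        "tnb_meters":   {"kind": "csv",    "path": "tnb_meters.csv"},
    }

    def fetch(dataset: str) -> list[dict]:
        """Return rows for a logical dataset, hiding the physical source from the caller."""
        meta = CATALOG[dataset]
        if meta["kind"] == "csv":
            with Path(meta["path"]).open(newline="", encoding="utf-8") as fh:
                return list(csv.DictReader(fh))
        conn = sqlite3.connect(meta["path"])
        conn.row_factory = sqlite3.Row
        try:
            # Table name comes from our own catalogue, never from user input.
            return [dict(r) for r in conn.execute(f'SELECT * FROM {meta["table"]}')]
        finally:
            conn.close()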

Business Intelligence

The nature of the client's business is the handling of massive data sources for analytical, consolidation, research and rating purposes. The data originates in multiple formats and/or as scattered data elements, so TechnoDex derived a standard extraction and upload application for the management of the databases. The goal was to develop a data delivery platform that would be a single, integrated source of data and information for D&B to sell all types of reports to all their clients, as well as to create new product lines to increase revenue.

This web-based portal provides a leading source of business information on commercial and personal entities. The integrated credit management solution enables pre-screening and prospecting and enhances portfolio management and recovery. The all-in-one credit assessment portal enables quick access to credit risk information on both companies and individuals, such as basic company registration information, litigation searches, financial reports, payment details and consumer credit reports.

Import / Data Management Module: Allows users to upload data from various sources and formats. All of this data is consolidated and stored in a central repository for later report generation.
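
A stripped-down sketch of the consolidation idea; the repository schema and the assumption that uploads arrive as CSV are illustrative, not the production module:

    import csv, sqlite3

    def import_file(db: sqlite3.Connection, source_file: str, source_name: str):
        """Consolidate one uploaded CSV into the central repository table."""
        db.execute("""CREATE TABLE IF NOT EXISTS repository
                      (source TEXT, entity_id TEXT, field TEXT, value TEXT)""")
        with open(source_file, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                entity = row.get("id", "")              # assumed common key column
                for field, value in row.items():
                    db.execute("INSERT INTO repository VALUES (?, ?, ?, ?)",
                               (source_name, entity, field, value))
        db.commit()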

Formulated pre-process scoring: Several schedulers/scoring engines run routinely to calculate useful ratings and ratios based on the data in the repository.
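
For example, a routine job might compute a simple payment punctuality ratio per company from the repository; the metric, table and schema are invented for illustration:

    import sqlite3

    def payment_punctuality(db: sqlite3.Connection, company_id: str) -> float:
        """Hypothetical pre-processed ratio: on-time payments / total payments."""
        total, on_time = db.execute(
            """SELECT COUNT(*), SUM(CASE WHEN days_late <= 0 THEN 1 ELSE 0 END)
               FROM payments WHERE company_id = ?""", (company_id,)).fetchone()
        return (on_time or 0) / total if total else 0.0

    # A scheduler (e.g. cron) would run this nightly over all companies and store
    # the resulting ratios back into the repository for fast report generation.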

Online Reporting Module: Generates reports at different levels, from very basic ROC (registered company) data to comprehensive analysis such as scoring and graphs/charts.

B2B Online Report: This function caters for batch processing by heavy-usage corporate users. Users can send web service requests based on the provided technical specification and extract the report from the response.
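
The actual technical specification belongs to the platform; purely to show the shape of the interaction, a corporate client might fetch reports like this (the endpoint, credential and payload are hypothetical):

    import json
    from urllib import request

    ENDPOINT = "https://example.com/b2b/report"   # hypothetical endpoint
    API_KEY = "your-api-key"                      # hypothetical credential

    def fetch_report(entity_id: str) -> dict:
        """Send one web service request and parse the report from the response."""
        payload = json.dumps({"entity_id": entity_id, "report_type": "credit"}).encode()
        req = request.Request(ENDPOINT, data=payload,
                              headers={"Content-Type": "application/json",
                                       "Authorization": f"Bearer {API_KEY}"})
        with request.urlopen(req) as resp:
            return json.load(resp)

    # Batch usage: loop over an ID list and collect the reports, e.g.
    #   reports = [fetch_report(eid) for eid in ("MY001", "MY002")]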

Offline Reporting Module: Handles the full flow of offline reports (reports that need additional research and composition by operations staff), from request and workflow assignment through to delivery. All delivered reports are stored on the server and can be resold on the same platform.

Credit Monitoring Module: A monitoring engine that assists users in monitoring a set list of IDs. If there are any changes to a monitored ID, the user is alerted via email.
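
One way such an engine can detect changes (the record layout, SMTP relay and addresses are all assumptions) is to keep a hash snapshot per monitored ID and compare it on each run:

    import hashlib, smtplib
    from email.message import EmailMessage

    snapshots: dict[str, str] = {}   # monitored ID -> last seen record hash

    def check(entity_id: str, current_record: str, notify: str):
        """Alert the subscriber by email when a monitored record changes."""
        digest = hashlib.sha256(current_record.encode()).hexdigest()
        if snapshots.get(entity_id) not in (None, digest):   # changed since last run
            msg = EmailMessage()
            msg["Subject"] = f"Change detected on monitored ID {entity_id}"
            msg["From"], msg["To"] = "alerts@example.com", notify   # hypothetical addresses
            msg.set_content(f"Record for {entity_id} has changed; please review.")
            with smtplib.SMTP("localhost") as smtp:                 # assumed local mail relay
                smtp.send_message(msg)
        snapshots[entity_id] = digest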

Value Proposition:

  • Provides efficient and effective ways to maintain and enhance data accuracy and validity
  • Speeds up data retrieval and updates
  • Reduces data loss during data migration
  • Enhances data integrity, enabling flexible extraction and retrieval
  • Creates a data platform for future application integration

 

 

Data Cleansing & Migration

Developed a web-based Data Cleansing and Migration Tool that can perform data cleansing and migration from one data source (database/document, e.g. MySQL, Oracle, CSV) into another. Users can configure the business rules for cleansing/migration, create and execute the migration/cleansing jobs, monitor job progress on a dashboard, correct exception data, and view statistical reports.
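
As a sketch of the configurable-rule idea (the rule format and the exception handling below are invented, not the tool's actual configuration):

    # Invented rule configuration: each rule names a field and a transform to apply.
    RULES = [
        {"field": "name",     "transform": str.strip},
        {"field": "postcode", "transform": lambda v: v.zfill(5)},
    ]

    def run_job(records: list[dict]) -> tuple[list[dict], list[dict]]:
        """Apply cleansing rules; route failures to an exception queue for correction."""
        migrated, exceptions = [], []
        for rec in records:
            try:
                for rule in RULES:
                    rec[rule["field"]] = rule["transform"](rec[rule["field"]])
                migrated.append(rec)
            except (KeyError, AttributeError, TypeError):
                exceptions.append(rec)   # surfaced on the dashboard for manual correction
        return migrated, exceptions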

The tool was used to cleanse and migrate PERKESO data from AS/400 DB2 and CSV files at various branches, merging it into a centralized MySQL database.

Services:

  • Study and analyze the existing AS/400 DB2 data structure and propose the to-be database structure.
  • Build a generic data migration and cleansing engine to migrate data from AS/400 DB2 to MySQL, which can be re-used for all PERKESO sites.
  • Convert MODCA files (image files) to a standard format (TIFF).
  • Address harmonization/cleansing (see the sketch after this list).
  • Data included:
  1. Registration of employees and employers.
  2. Collection of contributions from employees and employers.
  3. Payment of benefits details.
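
Address harmonization, flagged above, might look roughly like this; the abbreviation table is a small invented sample, not PERKESO's actual rule set:

    import re

    # Sample Malaysian address abbreviations; the real mapping table would be far larger.
    ABBREVIATIONS = {"JLN": "JALAN", "TMN": "TAMAN", "KG": "KAMPUNG", "LRG": "LORONG"}

    def harmonize_address(raw: str) -> str:
        """Normalise case and spacing, then expand common abbreviations."""
        cleaned = re.sub(r"\s+", " ", raw.upper().strip())
        words = [ABBREVIATIONS.get(w.rstrip("."), w.rstrip(".")) for w in cleaned.split(" ")]
        return " ".join(words)

    print(harmonize_address("jln.  ampang, kg baru"))  # -> JALAN AMPANG, KAMPUNG BARU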