Migrating from Oracle Discoverer to Microsoft SSRS

A Private, Women’s Liberal Arts College Migrates from Oracle Discoverer to MS SSRS in 8 weeks.

Retiring legacy reporting tools

The client is an internationally recognized, private, women’s liberal arts college located in Massachusetts. An IT-savvy institution, the college used Oracle Discoverer for all of its MIS requirements. After Oracle ended support for Discoverer in mid-2017, the college sought a migration path to a more intuitive platform, ultimately selecting Microsoft SQL Server Reporting Services (MS SSRS) to replace Discoverer.

Having already installed and used MS SSRS for years, the client avoided the need for a separate implementation project and additional user training. In this case, the three primary constraints were time, cost, and resources needed to migrate. The client’s internal IT team did not have the availability to manually migrate from Discoverer to MS SSRS, and needed ennVee’s help to automate the migration process.

Customer Snapshot

  • Private women’s liberal arts college
  • Location: Massachusetts, USA
  • 2,300 students
  • Annual endowment: ~$2 billion USD

Solution Process

Due to the constraints for both cost and time, we aimed to automate as much of the migration process as possible by utilizing our proprietary ennSight tool, which automatically identifies, assesses, and extracts all reports from Discoverer, including security and business rules.

To start, we conducted a comprehensive assessment of the client’s Discoverer reporting environment. The assessment produced a detailed snapshot of all reports broken down by department, owner, duplicates, SQL complexity, and estimated migration effort, helping the client determine which reports to migrate, sunset, or consolidate. The client then had its business users scrub the list for duplicates and obsolete reports. The final migration scope consisted of 500 essential reports covering four tracks: HRMS, FINANCE, FACULTY, and PAYRPT.

Next, the ennVee team used ennSight to automatically extract all 500 identified reports from Discoverer, identifying all user roles and user-assigned schemas. Based on the extraction, each report was placed into one of three categories: Simple, Medium, or Complex. This categorization allowed us to gauge the effort required to convert and test the workbooks and reports. Post-extraction, we automatically migrated each report “as is”. This significantly expedited the migration and gave the client the opportunity to review and decide whether any enhancements were required in the final MS SSRS output. Finally, each report was unit tested before being deployed to the client’s server.
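The Simple/Medium/Complex split could be driven by a heuristic over each report’s extracted SQL. The sketch below is purely illustrative (ennSight’s actual rules are proprietary); the scoring features and thresholds are assumptions, not the tool’s real logic:

```python
import re

def categorize_report(sql: str) -> str:
    """Rough complexity heuristic for a Discoverer report's SQL.

    Hypothetical rules: counts of joins, subqueries, and analytic
    functions push a report from Simple toward Complex.
    """
    sql_upper = sql.upper()
    score = 0
    score += sql_upper.count(" JOIN ")                  # explicit joins
    score += sql_upper.count("SELECT") - 1              # subqueries beyond the outer SELECT
    score += len(re.findall(r"OVER\s*\(", sql_upper))   # analytic functions
    if score <= 1:
        return "Simple"
    if score <= 4:
        return "Medium"
    return "Complex"

print(categorize_report("SELECT ename FROM emp"))  # Simple
```

A scheme like this lets the bulk of reports be routed automatically, reserving manual review for whatever lands in the Complex bucket.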

Total time to migrate: 8 weeks

[Diagram: Discoverer to MS SSRS migration process]



All 500 Discoverer reports were successfully migrated to MS SSRS in eight weeks without any loss of data or disruptive rebuild. Automation helped the customer reduce the amount of complexity involved, as well as time, cost, and resources required to migrate. Rather than dedicate its internal staff to manually migrating the reports, the ennVee team handled and automated the process, affording the customer additional time to focus on strategic projects.

Additionally, former Discoverer users can now create substantially more intuitive reports via the self-service MS SSRS reporting platform, and can move reports into Microsoft Power BI to augment the analytics or visual effects. Overall, automation enabled the college to dedicate more time to strategic initiatives while completing the Oracle Discoverer migration in just eight weeks.

Visit our website to learn more about automated Oracle Discoverer migration, or contact us at +1 888-848-6059.

Read more customer success stories »

An Automotive Parts Manufacturer’s Digital Vision Quest

Crafting an interactive digital portal for the partner, customer, and automotive aftermarket ecosystem at large.

Going Digital

This US-based global manufacturer leads the automotive aftermarket in the research, development, manufacturing, testing, and supply of brake system components. Over time, the company has amassed a diverse portfolio of parts and products through acquisitions of other leading parts suppliers and, most importantly, continuous innovation in the automotive aftermarket space.

Each of its major brands has its own separate website, and data flows back and forth between those sites and the main corporate website. While the client was in the process of modernizing each site, their greater digital plan called for an interactive web portal geared toward its customers, partners, and the automotive aftermarket ecosystem at large. They sought ennVee’s help to design a solution that would enable seamless sharing of product and other technical information with their partners, customers, and ecosystem.

Key Requirements

ennVee’s solution included building a portal that would provide access to all product details, a searchable catalog, technical bulletins, and various company-related and sponsored events. ennVee would also need to rebuild the underlying product search engine and catalog database from the ground up.

Build an interactive web portal that will enable the customer to accomplish the following:

  • Accelerate growth of business with partners and customers

  • Create an unparalleled customer experience by imparting a digital world of information related to all products, parts, technical bulletins, news, and events

  • Increase digital presence and the number of active registered users via exclusive access to the portal and eCatalog.


Web Portal At A Glance

ennVee worked closely with the client to build a digital portal solution that caters to its global ecosystem of customers and partners.

Key Features Breakdown

[Diagram: portal catalog key features]



Product Overview

  • Provides an overview of the product
  • Provides details for new and re-manufactured starters, alternators, and steering systems

Technical Bulletins

  • Periodically uploaded by the customer’s Product Engineering team
  • Provides details related to any new or updated products, parts, or services


Standards and Quality

  • Articles and write-ups on various parts standards and quality aspects

News and Events

  • Company and industry-related events
  • Filter by geography


eCatalog

  • The database containing comprehensive information on the customer’s product portfolio
  • Responsive and provides an optimal viewing experience across all types of devices
  • Search for parts from the home page itself
  • Locate parts via the actual Part Number or VIN number
  • Search by selecting options from a drop-down menu for the vehicle’s category, make, model, or year; any parts matching the search criteria are then displayed
  • View the details of each part, including engine, amperage rating, fitment notes, grade, voltage, pulley details, regulator plug position, rotation direction, power rating, starter rotation, tooth quantity, etc.

Detailed View

  • Multiple images are available to view for each part
  • Download technical documentation (PDF files) to any device
  • Part details can be printed from any device
  • Users can select multiple parts to compare side-by-side in a tabular layout
  • Generate a detailed review of each part
  • Users can input competitor part numbers to view related parts
  • Comprehensive search using criteria like begins with, contains, ends with, and exact match. A list of parts matching the search criteria is displayed, and a detailed view can be opened by selecting each part.
  • Any part-related inquiries can be submitted directly through the portal
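The begins with / contains / ends with / exact match options map naturally onto SQL LIKE patterns. A minimal sketch of that mapping, assuming a hypothetical `parts` table and `part_number` column (the portal’s real schema is not described in the source):

```python
def build_part_search_sql(criteria: str, term: str) -> tuple:
    """Map a search criterion to a parameterized SQL clause.

    Table and column names (parts, part_number) are illustrative only.
    """
    patterns = {
        "begins with": term + "%",
        "ends with": "%" + term,
        "contains": "%" + term + "%",
        "exact match": term,
    }
    if criteria not in patterns:
        raise ValueError(f"unknown criterion: {criteria}")
    # Exact match uses equality; the rest use LIKE with a wildcard pattern.
    op = "=" if criteria == "exact match" else "LIKE"
    return f"SELECT * FROM parts WHERE part_number {op} ?", (patterns[criteria],)

sql, params = build_part_search_sql("begins with", "ALT")
# sql is "SELECT * FROM parts WHERE part_number LIKE ?" with params ("ALT%",)
```

Using parameterized queries, as above, keeps user-supplied search terms from being interpreted as SQL.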

Project Topology

  • HTML, CSS, JavaScript
  • jQuery
  • Ajax
  • Java
  • REST Web Services
  • MS SQL Server
  • Magnolia CMS
  • Adobe Photoshop
  • Sublime
  • Eclipse
  • Apache Tomcat
  • Load balanced instances: Development (2), Staging (2), Production (2)

Business Benefits

The digital portal is a strategic addition to the portfolio of services that the client offers to its global customer and partner ecosystem. A cornerstone of its digital strategy, the portal will enable the customer to provide a seamless omni-channel experience, leading to more business and greater loyalty from both partners and customers. The portal has also been fully integrated with the client’s back-office Oracle Enterprise Resource Planning (ERP) system, automatically synchronizing and sending data to each department (Sales, Marketing, etc.).

The digital journey is ongoing and requires continuous tweaks and enhancements to maximize Return on Investment. With this in mind, the portal is scalable, allowing the customer to rapidly introduce new features without disrupting normal business operations.

Visit our website to learn more about ennVee Digital, or contact us at +1 888-848-6059.

Read more digital success stories »


NCOAUG 2018 Training Day

Connect with ennVee at the annual NCOAUG Training Day on March 9th at Drury Lane in Oakbrook Terrace, IL.

The annual NCOAUG winter Training Day will be held on Friday, March 9th at the Drury Lane Theatre in Oakbrook Terrace, Illinois. This is a great opportunity for Oracle users to network and catch up on the latest Oracle cloud, applications, and IT industry trends through a mix of educational sessions and discussion panels.

The North Central Oracle Applications User Group (NCOAUG) is part of the national Oracle Applications User Group (OAUG) and has been providing outstanding Oracle education since 1994 for the seven-state region comprising Illinois, Iowa, Wisconsin, Indiana, Michigan, Minnesota, and Ohio.

This year, ennVee has joined as one of the conference sponsors and also submitted an abstract for a chance to present under the Oracle E-Business Suite learning track. Veera Venugopal (President), and Joe Bong (Vice President, Global Sales & Marketing) anticipate speaking about automation best practices for Oracle E-Business Suite R12.2 upgrades, as well as a summation of the results from our 2017 IT leadership survey.

We look forward to meeting new and familiar faces in the Midwest. Make sure to stop by our booth to meet the ennVee team.


Register to attend and learn more by visiting the NCOAUG website.

Dealing With Unsanitized Data

The amount of data in the world is growing rapidly every day — but this data needs to be analyzed correctly. Check out what you need to consider when dealing with unsanitized data.

By: Arun Yaligar


Big data is not just a buzzword. It is an important concept with considerable impact on business in general. Big data is a vast collection of structured and unstructured data gathered from internal and external sources which, after processing and analysis, can be turned into valuable insights. Conventional database techniques cannot be applied to big data processing. In today’s information- and technology-dependent world, there is a pressing need for new, effective techniques to handle data and make the most of it. Real-time data collection gives us the opportunity to learn customer preferences as they emerge. Big data enables customer segmentation, a customized approach, and the ability to target the audience more precisely.

Challenges Faced When Using Unsanitized Data

First of all, all that data needs to be analyzed correctly. The following points are important to consider when dealing with unsanitized data.

  • Identifying the correct filters is crucial. The amount of information is overflowing but not all of it is relevant or useful. Not all of the available information needs to or should be ingested and processed. Setting the right filters so that you don’t miss the important data will determine the ultimate success of the analysis.
  • Large amounts of extraordinary data call for big data environments because traditional data computing and processing won’t do here. For efficient analysis, big data processing should be performed automatically. Big data startups should elaborate an appropriate approach to storing and structuring information in the most efficient way.
  • Automatically produce metadata to enhance research, while still keeping in mind that computer systems may have defects and cause false results.
  • There is an urgent need for qualified people who can handle, analyze, and structure data. Innovation is progressing very quickly, and information is streamed from multiple resources. Developing a smart approach to prioritizing and processing big data is vital, though it is quite difficult to find people who possess the right skills.
  • Big data startups should consider privacy and security issues, as well.

Best Practices to Avoid Dealing With Unsanitized Data

Let’s look at potential solutions for challenges involving the three Vs — data volume, variety, and velocity — as well as privacy, security, and quality.

Potential Solutions for Data Volume Challenges

Let’s talk about Hadoop, visualization, robust hardware, grid computing, and Spark.


Hadoop

Tools like Hadoop are great for managing massive volumes of structured, semi-structured, and unstructured data. Because it is a newer technology, many professionals are unfamiliar with Hadoop, and using it requires substantial learning, which can divert attention from solving the main problem toward learning Hadoop itself.


Visualization

Visualization is another way to perform analyses and generate reports, but sometimes the granularity of the data makes it difficult to access the level of detail needed.

Robust Hardware

Robust hardware is another good way to handle volume problems. It enables increased memory and powerful parallel processing to chew through high volumes of data swiftly.

Grid Computing

Grid computing is represented by a number of servers that are interconnected by a high-speed network; each of the servers plays one or many roles.


Spark

Platforms like Spark pair an optimized execution model with in-memory computing to deliver huge performance gains on high-volume, diversified data. All of these approaches allow firms and organizations to explore huge data volumes and extract business insights. Broadly, there are two ways to deal with the volume problem: shrink the data, or invest in good infrastructure. Based on budget and requirements, we can select the most appropriate technology or method.

Potential Solutions for Data Variety Challenges

Let’s look at OLAP tools, Apache Hadoop, and SAP HANA.

OLAP (Online Analytical Processing) Tools

Data processing can be done using OLAP tools to establish connections between information and assemble data logically so that it can be accessed easily. OLAP specialists can process high-volume data quickly. One drawback is that OLAP tools process all of the data provided to them regardless of its relevancy.

Apache Hadoop

Hadoop is open-source software whose main purpose is to manage huge amounts of data in a very short time with great ease. Hadoop distributes data across multiple systems for processing and creates a map of the content so that data can be easily accessed and found.
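Hadoop’s divide-and-process approach is the MapReduce model. The toy sketch below imitates its map, shuffle, and reduce phases on in-memory strings; real Hadoop distributes these phases across a cluster rather than running them in one process:

```python
from collections import defaultdict

def map_phase(chunk):
    # Emit (word, 1) pairs, as a Hadoop mapper would for word counting.
    for word in chunk.split():
        yield word.lower(), 1

def reduce_phase(pairs):
    # Sum counts per key, as a Hadoop reducer would after the shuffle.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

# Each string stands in for a data block assigned to one mapper.
chunks = ["big data needs big tools", "hadoop splits big jobs"]
pairs = [pair for chunk in chunks for pair in map_phase(chunk)]
print(reduce_phase(pairs)["big"])  # 3
```

The point of the pattern is that each mapper sees only its own chunk, so the work parallelizes naturally across machines.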


SAP HANA

SAP HANA is an in-memory data platform that is deployable as an on-premise appliance or in the cloud. It is well suited to performing real-time analytics and to developing and deploying real-time applications. Its database and indexing architectures make sense of disparate data sources swiftly.

Potential Solutions for Velocity Challenges

Let’s talk about flash memory, transactional databases, and cloud hybrid models.

Flash Memory

Flash memory is needed for caching data, especially in dynamic solutions that can classify that data as either hot (highly accessed) or cold (rarely accessed).

Transactional Databases

According to Tech-FAQ, “A transactional database is a database management system that has the capability to roll back or undo a database transaction or operation if it is not completed appropriately.” Paired with real-time analytics, transactional databases can support faster decision-making.
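The rollback behavior described above can be demonstrated with Python’s built-in sqlite3 module, which wraps data-modifying statements in an implicit transaction:

```python
import sqlite3

# In-memory database demonstrating rollback of an incomplete transaction.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('alice', 100), ('bob', 50)")
conn.commit()

try:
    conn.execute("UPDATE accounts SET balance = balance - 70 WHERE name = 'alice'")
    # Simulate a failure before the matching credit is applied.
    raise RuntimeError("transfer interrupted")
except RuntimeError:
    conn.rollback()  # undo the debit so the books stay consistent

balance = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'alice'"
).fetchone()[0]
print(balance)  # 100 — the partial transaction was undone
```

Without the rollback, the debit would persist with no matching credit, which is exactly the inconsistency transactional databases exist to prevent.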

Cloud Hybrid Models

Expanding the private cloud into a hybrid model reduces the additional computational power an organization must own for data analysis, and helps determine the hardware, software, and business process changes needed to handle high-velocity data.

Potential Solutions for Quality Challenges

Let’s talk about data visualization and big data algorithms.

Data Visualization

If data quality is the concern, visualization is effective because it lets us see where outliers and irrelevant data lie. To maintain quality, firms should have an active data control, surveillance, or information management process to ensure that the data is clean. Plotting data points on a graph for analysis becomes difficult when dealing with an extremely large volume of data or a wide variety of information. One way to resolve this is to cluster data into a higher-level view where smaller clusters or bunches of data become visible. By grouping the data together, or “binning,” you can visualize the data more effectively.
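Binning can be sketched in a few lines: raw values are grouped into fixed-width buckets, producing the higher-level view described above. The bin width of 10 here is arbitrary:

```python
def bin_points(values, bin_width):
    """Group raw values into fixed-width bins for a higher-level view."""
    bins = {}
    for v in values:
        lower = (v // bin_width) * bin_width   # floor to the bin's lower edge
        key = (lower, lower + bin_width)
        bins[key] = bins.get(key, 0) + 1
    return bins

readings = [1, 2, 7, 8, 9, 14, 15, 22]
print(bin_points(readings, 10))  # {(0, 10): 5, (10, 20): 2, (20, 30): 1}
```

Plotting the three bin counts instead of the eight raw points is the same trade the text describes: less detail, far more legibility at scale.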

Big Data Algorithms

Data quality and relevance are not new concerns; they have mattered ever since firms began storing every piece of data they produce. Dirty data is expensive, costing companies hundreds of billions of dollars every year. Big data algorithms, well suited to maintaining, managing, and cleaning data, can be an effective way to clean it. Many algorithms and models exist, and we can also write our own algorithms to act on the data.

Potential Solutions for Privacy and Security Challenges

Let’s talk about examining cloud providers, having an adequate access control policy, and protecting data.

Examine Your Cloud Providers

Storing big data in the cloud is a convenient storage option, but we also need to take care of its protection mechanisms. We should make sure that our cloud provider performs frequent security audits and agrees to pay penalties if adequate security standards are not met.

Maintain an Adequate Access Control Policy

Create policies in such a way that allows access to authorized users only.

Protect the Data

Data should be adequately protected at every stage, starting with the raw data. Encryption ensures that no sensitive data is leaked; indeed, the main way to keep data protected is the adequate use of encryption. For example, attribute-based encryption (a type of public-key encryption in which a user’s secret key and the ciphertext depend upon attributes) provides access control over encrypted data.
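A real attribute-based encryption scheme requires specialized cryptographic libraries, but the access decision it enforces can be modeled simply: decryption succeeds only when a user’s attributes satisfy the policy attached to the ciphertext. The sketch below models that check only, with made-up attribute names and no actual cryptography:

```python
def satisfies(policy, user_attributes):
    """A user may decrypt only if their attributes cover the policy.

    Real attribute-based encryption enforces this cryptographically;
    this toy check only models the access decision.
    """
    return set(policy) <= set(user_attributes)

# Hypothetical policy attached to an encrypted document.
ciphertext_policy = {"department:finance", "role:analyst"}

print(satisfies(ciphertext_policy,
                {"department:finance", "role:analyst", "region:us"}))  # True
print(satisfies(ciphertext_policy, {"department:finance"}))            # False
```

The appeal of the scheme is that the data owner encodes the policy once, rather than managing per-user keys for every document.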


Everything has two sides: opportunities and challenges are everywhere. Threats should be considered, not neglected.

We use different techniques for big data analysis including statistical analysis, batch processing, machine learning, data mining, intelligent analysis, cloud computing, quantum computing, and data stream processing. There is a great future for the big data industry and lots of scope for research and improvements.



Arun Yaligar is a MuleSoft Developer at ennVee. 

This article was originally posted to The Integration Zone.

Client Case Study: Integrating ServiceMax and Oracle EBS

A leading manufacturer of commercial printing solutions integrates Oracle EBS and ServiceMax to increase cost savings and time optimization.

Solution Background

The customer provides graphic and precision solutions and is headquartered in Japan with its North America office located in Chicago, IL. They are one of the largest manufacturers and suppliers of system components for the prepress and printing industries worldwide. 

After implementing ServiceMax to manage their after-sales operations, the customer needed to integrate ServiceMax with its back-office Oracle E-Business Suite (R12) system. The customer needed ennVee’s help to create a holistic solution, enabling the seamless and automated exchange of data between Oracle E-Business Suite and ServiceMax.


Key Requirements:

Build interfaces between Oracle E-Business Suite and ServiceMax to automatically:

  • Send and receive all order information to process the order in Oracle Order Management.
  • Synchronize the work order information in both systems.
  • Handle all errors, exceptions, or reconciliation requirements.


ennVee Solution

ennVee developed interfaces to connect Oracle E-Business Suite and ServiceMax, and designed a custom solution to automate the following processes:

  • Raising and Tracking Service Requests in ServiceMax, and integrating the requests with Oracle E-Business Suite.
  • Maintaining parts prices in sync across both systems.
  • Sending Work Order information to ServiceMax.
  • Creating labor orders in Oracle Order Management before processing them for invoicing.
  • Validating the order, returning used parts to inventory, and taking the line to closure in Oracle E-Business Suite.


Eight key business process functionalities were addressed:

  1. A work order is entered into ServiceMax and a sales order is entered in Oracle E-Business Suite whenever there is a request from a customer for parts installation.
  2. A sales order entered into Oracle E-Business Suite will contain the quantity of the order placed and picked by the Engineer.
  3. An interface sends the information to ServiceMax, and the work order number in ServiceMax is updated along with the parts that are selected for installation.
  4. After the installation is done at customer site, labor and expense charges are entered in ServiceMax for the respective work order.
  5. Information is brought into Oracle E-Business Suite to create a sales order for the invoices and labor charges.
  6. The Engineer inputs the parts used during installation into the work order and completes the work order.
  7. This information is also brought into Oracle E-Business to perform a back order for any unused parts and to close the order line.
  8. There is an interface that runs daily to synchronize the prices of the parts in Oracle E-Business and ServiceMax, and sends over any parts that have undergone a price change or were recently added (new parts).
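The daily price-sync step (item 8) boils down to a diff between yesterday’s and today’s price lists: only changed or newly added parts need to be sent across. A minimal sketch with hypothetical part numbers:

```python
def parts_to_sync(previous, current):
    """Return parts whose price changed, or that are new, since the last run.

    Keys are part numbers and values are prices; the names are illustrative.
    """
    return {
        part: price
        for part, price in current.items()
        if previous.get(part) != price   # new part, or price differs
    }

yesterday = {"ALT-100": 89.99, "STR-200": 129.50}
today = {"ALT-100": 84.99, "STR-200": 129.50, "STR-300": 199.00}
print(parts_to_sync(yesterday, today))  # {'ALT-100': 84.99, 'STR-300': 199.0}
```

Sending only the delta, rather than the full catalog, keeps the nightly interface fast as the parts list grows.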


Solution Approach

We designed and built four interfaces that would provide the various functionalities required by the customer. 

  • Interface 1 – Outbound from Oracle EBS (to extract the parts and pricing information to be sent to ServiceMax). Data will be extracted from Oracle EBS into a .CSV file, which will be sent to an inbound ServiceMax server via FTP.
  • Interface 2 – Outbound from Oracle EBS. Whenever a service request is raised for installation of parts, a sales order is created in Oracle EBS. This order will have customer information, work order number, parts (items number), quantity, price, engineer bin location, shipped quantity, picked quantity, etc. All of this information is extracted and sent to ServiceMax via .CSV file.
  • Interface 3 – Inbound to Oracle EBS. An Order for Labor charge will be entered in ServiceMax and sent to Oracle EBS for a sales order creation. After, a sales order for the labor/service charges will be created in Oracle EBS and invoice-ready.
  • Interface 4 – Inbound to Oracle EBS. When the work order is closed in ServiceMax, the engineer enters the parts used and all detailed information. This information, including the work order number, parts consumed, closing comments, and work order close date, is pushed to Oracle EBS.
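Interface 1’s extract-to-CSV step can be sketched with Python’s csv module. The field names here are illustrative; the real extract would query Oracle EBS and then transfer the resulting file to the ServiceMax server via FTP:

```python
import csv
import io

def export_parts_csv(rows):
    """Write parts and pricing rows to CSV text, ready for FTP transfer.

    Field names are illustrative; the real extract would query Oracle EBS.
    """
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["part_number", "description", "price"])
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

rows = [{"part_number": "ALT-100", "description": "Alternator", "price": "84.99"}]
print(export_parts_csv(rows).splitlines()[0])  # part_number,description,price
```

Using DictWriter keeps the column order fixed by the header, so the receiving system can rely on a stable file layout from run to run.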


Business Benefits

As a result of the new integration between ServiceMax and Oracle E-Business Suite, the customer benefitted immensely from a cost savings and time optimization standpoint. With an automated process for raising, tracking, and accessing Service Requests in Oracle E-Business Suite, the customer was able to reduce operational costs, nearly eliminating any manual intervention in the Service Request process. Automating the flow of data between systems enabled drastic business growth and yielded a faster exchange of data across the system.


Visit our website for additional case studies.