The term ‘backup and recovery’ refers to the various policies and strategies deployed to protect a database against data loss and to reconstruct it after a loss occurs.
Physical Backups and Logical Backups – A backup is a copy of data from the database that can be used to reconstruct that data. Physical backups are backups of the physical files used in storing and recovering the database, such as control files, data files and archived redo logs. Ultimately, every physical backup is a copy of the files that store database data to some other location – on disk, or on offline storage such as tape. Logical backups consist of logical data, such as tables or stored procedures, exported from a database with an Oracle export utility and stored in a binary file, to be re-imported later into a database with the corresponding Oracle import utility.
Physical backups form the basis of any sound backup and recovery strategy. Logical backups often supplement physical backups, but on their own they are usually not sufficient protection against data loss.
The term “backup” as used in the backup and recovery documentation generally means a physical backup, and backing up part or all of a database means taking some kind of physical backup. The focus in the backup and recovery documentation is almost exclusively on physical backups.
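As a minimal sketch of the physical-backup idea described above – copying the files that store the database to another location – the following hypothetical routine copies a directory tree of database files to backup storage. The paths and the `cold_backup` function name are illustrative assumptions, not part of any Oracle utility; a real cold backup would require the database to be cleanly shut down first.

```python
import shutil
from pathlib import Path

# Hypothetical locations -- adjust for your own installation.
DATA_FILES = Path("/u01/oradata/PROD")   # data files, control files, redo logs
BACKUP_DIR = Path("/backup/PROD")        # disk or mounted offline storage

def cold_backup(source: Path, target: Path) -> list[str]:
    """Copy every file under `source` to `target`, preserving the
    directory layout. Returns the names of the files copied."""
    target.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in source.rglob("*"):
        if f.is_file():
            dest = target / f.relative_to(source)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, dest)  # copy2 preserves timestamps
            copied.append(f.name)
    return copied
```

In practice you would use RMAN or the platform's storage tooling rather than a hand-rolled copy, but the sketch captures what "a copy of the files storing database details to some other location" amounts to.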
Monitoring and Managing Data Growth
Data warehousing is unique in many respects. Raw performance is not the primary concern: if performance is deemed acceptable, it is good enough. However, certain aspects of data warehouse analytical processing do require close monitoring. First, the analyst needs to watch for dormant data – data that resides in the data warehouse but is not being accessed. In the early days of a data warehouse there is hardly any dormant data to be found, but as the warehouse grows older, dormant data starts to accumulate.
Dormant data in a data warehouse environment is like cholesterol in the bloodstream of the body. With enough cholesterol, the heart has to pump extra hard just to move blood around. Further, dormant data costs money: it costs both in wasted storage and in the processor cycles needed to move data through the system. Dormant data is not good for a data warehousing environment.
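The dormant-data check described above can be sketched as a simple threshold test over last-access timestamps. This is an illustrative assumption about how such monitoring might be implemented; the function name, the table-level granularity, and the 180-day threshold are all hypothetical choices, not from the original text.

```python
from datetime import datetime, timedelta

def find_dormant(last_access: dict[str, datetime],
                 now: datetime,
                 threshold_days: int = 180) -> list[str]:
    """Return the names of objects not accessed within the threshold --
    candidates for removal or migration to near-line/off-line storage."""
    cutoff = now - timedelta(days=threshold_days)
    return sorted(name for name, ts in last_access.items() if ts < cutoff)
```

In a young warehouse the returned list is typically empty; as the warehouse ages, it grows, which is exactly the accumulation effect the analogy warns about.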
The following are the major types of activities that occur in the data warehouse:
- data loads
- data archives
- queries and reports
- backups/recoveries
Data Usage Monitoring – You must determine data usage by identifying which rows are and are not being used, and which columns are and are not being used. Usage must be measured at the row and column level rather than at the table level: only one row in a table may be accessed, or, even when most rows are accessed, only a few columns may be touched. Data usage monitoring also discovers when data is used (its period of usage). Data that is never or only infrequently used should be removed from the data warehouse or moved to near-line or off-line storage.
Data Warehouse Users Monitoring – You must also identify which users access the data warehouse frequently and which do not. This identifies the heavy users of the data warehouse and allows the data warehouse support personnel to provide them additional support.
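Identifying heavy users, as described above, reduces to ranking users by their query counts. This is a minimal sketch under the assumption that per-user query counts are already available from the warehouse's audit log; the function name is hypothetical.

```python
from collections import Counter

def heavy_users(queries_by_user: dict[str, int], top_n: int = 5) -> list[str]:
    """Rank users by query count, highest first; the top entries are the
    heavy users who may warrant additional support."""
    return [user for user, _ in Counter(queries_by_user).most_common(top_n)]
```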
Response Time Monitoring – This helps you identify problems and ascertain that adequate response times are maintained. Response times should be measured over a long period of time to adequately assess what is really happening.
Data Warehouse Activity Monitoring – You must identify the makeup of the workload going through the data warehouse. Identifying the types of activities, such as large or small queries, and at what hours they occur helps you pinpoint performance issues.
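The workload-makeup profiling described above can be sketched by bucketing queries as small or large per hour of day. The one-million-row threshold and the `(hour, rows_scanned)` log format are assumptions made for the example.

```python
from collections import defaultdict

def workload_profile(queries: list[tuple[int, int]]) -> dict[int, dict[str, int]]:
    """queries: list of (hour_of_day, rows_scanned) pairs.
    Bucket each query as small or large per hour to expose peak periods
    and the mix of query sizes hitting the warehouse."""
    profile: dict[int, dict[str, int]] = defaultdict(
        lambda: {"small": 0, "large": 0})
    for hour, rows in queries:
        bucket = "large" if rows > 1_000_000 else "small"
        profile[hour][bucket] += 1
    return dict(profile)
```

A profile showing large queries clustered in business hours, for example, suggests rescheduling batch analytics into off-peak windows.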
Managing Data Growth
How do you prevent data warehouse requirements from outgrowing your ability to manage them? First, determine the business drivers, and make certain that your business-level requirements and schedules reflect those drivers correctly. Second, define those requirements by year over a strategic time frame (usually three to five years). Third, develop a set of concrete usage scenarios representing the workload that the business requirements will demand.
With the business drivers and requirements clear, you can focus more closely on the data-warehouse issues, including database size, structure, workload, and service-level agreements. Build a margin of safety into your requirements; you’ll never have complete certainty about how the system will be used and what might ultimately expand those requirements. The next step is to evaluate the long-term and near-term trade-offs:
- In the long term, the ability to leverage data from across the enterprise pays huge dividends, and is the key to most of the success stories you hear about. Clearly, the core of your enterprise data must be integrated to provide a platform for rapid and inexpensive implementation of analytical solutions.
- In the near term, you want to identify where you can gain significant cost or performance advantages in a specific application or sandbox with a data-mart or appliance approach. Remember that some benefits may come at the price of fragmenting decision support data and incurring greater overhead for replication, ETL and other data movement and integration.
- It is important to test your intended solution against your detailed requirements before committing to it. This also applies to scalability: test the platform, configuration, database design and so on.
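The sizing advice above – projecting requirements over a three-to-five-year strategic time frame with a built-in margin of safety – amounts to a compound-growth calculation. The growth rate and 25% default margin in this sketch are hypothetical inputs you would replace with your own business-driver estimates.

```python
def projected_size_tb(current_tb: float,
                      annual_growth_rate: float,
                      years: int,
                      safety_margin: float = 0.25) -> float:
    """Project warehouse size after `years` of compound growth, then
    pad by a safety margin to absorb unanticipated usage."""
    projected = current_tb * (1 + annual_growth_rate) ** years
    return projected * (1 + safety_margin)
```

For instance, a 10 TB warehouse growing 50% a year needs roughly 42 TB of planned capacity after three years once a 25% margin is included – a useful number to test candidate platforms and configurations against.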