The Working of Microsoft SQL Server CDC

Modern-day businesses must preserve historical data and take measures to prevent data breaches. Microsoft took an early lead here when it launched SQL Server CDC in 2005.

The 2005 version of SQL Server CDC had certain flaws, which were ironed out in an updated release in 2008. Its core functionality is tracking and capturing every change made to SQL Server database tables without the help of additional programs or applications. Until 2016, Microsoft offered SQL Server CDC only in its high-end Enterprise editions; it later became available in the Standard edition as well.

SQL Server CDC captures and records all insert, update, and delete activity applied to SQL Server tables. For every modified row, the column data and the metadata required to apply the change to a target database are recorded and stored in change tables whose structure mirrors the columns of the tracked source tables. The change tables thereby hold a complete record of every change made to the mirrored source tables.
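Enabling the feature is a two-step affair: first at the database level, then per table. The sketch below is a minimal example assuming a hypothetical database SalesDb and source table dbo.Orders; sys.sp_cdc_enable_db and sys.sp_cdc_enable_table are the documented system procedures for this.

```sql
-- Step 1: enable CDC for the database (requires sysadmin membership).
USE SalesDb;
EXEC sys.sp_cdc_enable_db;

-- Step 2: enable CDC for one source table. SQL Server generates a change
-- table (cdc.dbo_Orders_CT) plus the capture and cleanup jobs.
EXEC sys.sp_cdc_enable_table
    @source_schema        = N'dbo',
    @source_name          = N'Orders',  -- hypothetical table name
    @role_name            = NULL,       -- NULL = no gating role on change data
    @supports_net_changes = 1;          -- also expose a net-changes function
                                        -- (requires a primary key or unique index)
```

Because the capture job reads the transaction log asynchronously, no triggers are placed on the source table and the overhead on the workload stays low.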

The change tables carry additional metadata columns that describe each captured change; the data is in the following format (a query sketch follows the list).

· __$start_lsn and __$end_lsn, which show the commit log sequence number (LSN) assigned by the SQL Server engine to the recorded change

· __$seqval, which shows the order of that change relative to other changes in the same transaction

· __$operation, which shows the operation type of the change, where 1 = delete, 2 = insert, 3 = update (before change), and 4 = update (after change)

· __$update_mask, a bitmask defined over the captured columns that identifies which columns were updated
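To consume the recorded changes, the generated table-valued function for the capture instance can be queried between two LSN boundaries. A sketch, continuing the hypothetical dbo_Orders capture instance from above:

```sql
-- Fetch every change recorded for dbo_Orders between the oldest and
-- newest LSNs currently retained in the change table.
DECLARE @from_lsn BINARY(10) = sys.fn_cdc_get_min_lsn(N'dbo_Orders');
DECLARE @to_lsn   BINARY(10) = sys.fn_cdc_get_max_lsn();

SELECT
    __$start_lsn,
    __$seqval,
    CASE __$operation
        WHEN 1 THEN 'delete'
        WHEN 2 THEN 'insert'
        WHEN 3 THEN 'update (before change)'
        WHEN 4 THEN 'update (after change)'
    END AS operation,
    __$update_mask
    -- the captured source columns can be selected here as well
FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all update old');
```

The row filter N'all update old' returns both the before (3) and after (4) images of updates; N'all' would return only the after image. sys.fn_cdc_map_time_to_lsn can translate timestamps into LSN boundaries when progress is tracked by time instead.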

SQL Server CDC is an excellent method to preserve historical data. 
