Wednesday, December 6, 2017

How to send email notification On Success and On Failure in SQL Agent job

By default, a SQL Agent job lets you send an email notification for only one of the following events:
  1. When the job succeeds
  2. When the job fails
  3. When the job completes
In most cases this is not enough, because a job may need different emails for different outcomes, such as one on success and another on failure. There are many ways to do this; the easiest is to add extra steps and configure the main step's On success / On failure actions accordingly.
In this example, when the Load step succeeds the job goes to Step [2], and when it fails the job goes to Step [3].
Here, Step 2 sends the success email:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DataWarehouse',
    @recipients = 'uma.bale@xxxx.co.nz',
    @body = 'Data load successfully completed',
    @subject = 'Data load successfully completed';


Here, Step 3 sends the failure email:
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'DataWarehouse',
    @recipients = 'uma.bale@xxxx.co.nz',
    @body = 'Data load failed',
    @subject = 'Data load failed';

If you prefer, you can use a single T-SQL step and send a custom email notification based on the outcome; a CASE expression or an IF condition can be used, as in the sketch below.
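A minimal sketch of the single-step approach, wrapping the load in TRY...CATCH so the branching happens on the error path (dbo.LoadDataWarehouse is a hypothetical procedure name standing in for the real load work):
BEGIN TRY
    EXEC dbo.LoadDataWarehouse;  -- hypothetical procedure doing the actual load

    -- reached only if the load succeeded
    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DataWarehouse',
        @recipients = 'uma.bale@xxxx.co.nz',
        @body = 'Data load successfully completed',
        @subject = 'Data load successfully completed';
END TRY
BEGIN CATCH
    -- reached only if the load failed; include the error text in the body
    DECLARE @err NVARCHAR(2048) = ERROR_MESSAGE();

    EXEC msdb.dbo.sp_send_dbmail
        @profile_name = 'DataWarehouse',
        @recipients = 'uma.bale@xxxx.co.nz',
        @body = @err,
        @subject = 'Data load failed';
END CATCH;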
Cheers!
Uma

Wednesday, November 15, 2017

What is the main difference between Change Data Capture (CDC) and Change Tracking (CT)

SQL Server provides two features that track changes to data in a database: change data capture and change tracking. These features enable applications to determine the DML changes (insert, update, and delete operations) that were made to user tables in a database.
Change Data Capture
Change data capture provides historical change information for a user table by capturing both the fact that DML changes were made and the actual data that was changed. Changes are captured by using an asynchronous process that reads the transaction log and has a low impact on the system.
Change Tracking
Change tracking captures the fact that rows in a table were changed, but does not capture the data that was changed. This enables applications to determine the rows that have changed with the latest row data being obtained directly from the user tables. Therefore, change tracking is more limited in the historical questions it can answer compared to change data capture. However, for those applications that do not require the historical information, there is far less storage overhead because of the changed data not being captured. A synchronous tracking mechanism is used to track the changes. This has been designed to have minimal overhead to the DML operations.
The main difference is that Change Data Capture keeps the historical data, while Change Tracking only keeps the fact of the latest change, using version numbers.
To enable Change Tracking for a database:
ALTER DATABASE LearningCT
SET CHANGE_TRACKING = ON
(CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);
To enable Change Tracking for a table:
ALTER TABLE Employee
ENABLE CHANGE_TRACKING;
or
ALTER TABLE Employee
ENABLE CHANGE_TRACKING
WITH (TRACK_COLUMNS_UPDATED = ON);
One of the main conditions before enabling Change Tracking is that the table must have a primary key; otherwise you will get the following error:
Cannot enable change tracking on table 'Employee1'. Change tracking requires a primary key on the table. Create a primary key on the table before enabling change tracking.
The only option is TRACK_COLUMNS_UPDATED. When set to OFF, only the fact that a row has changed is stored. When set to ON, the information about which columns were updated is also stored and can be read using the CHANGE_TRACKING_IS_COLUMN_IN_MASK function. As this option adds overhead, it is set to OFF by default. You can see the difference with and without the option in the sketch below.
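A hedged sketch of reading the column mask (assumes TRACK_COLUMNS_UPDATED = ON on dbo.Employee and a Salary column; both names are illustrative):
-- Did the Salary column change in each tracked update?
SELECT c.EmployeeID,
       CHANGE_TRACKING_IS_COLUMN_IN_MASK(
           COLUMNPROPERTY(OBJECT_ID('dbo.Employee'), 'Salary', 'ColumnId'),
           c.SYS_CHANGE_COLUMNS) AS SalaryChanged
FROM CHANGETABLE(CHANGES dbo.Employee, 0) AS c
WHERE c.SYS_CHANGE_OPERATION = 'U';
With the option OFF, SYS_CHANGE_COLUMNS is always NULL, so this per-column question cannot be answered.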
You can join the change table with the original table to retrieve the changed data:
SELECT Employee.*
FROM Employee
LEFT JOIN CHANGETABLE(CHANGES Employee, @latest_version) AS CHANGES
    ON Employee.EmployeeID = CHANGES.EmployeeID
WHERE CHANGES.SYS_CHANGE_OPERATION IN ('I', 'U');
Here, @latest_version can be derived via:
SELECT CHANGE_TRACKING_CURRENT_VERSION();
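In practice you pass the version saved from the previous sync, not the current one, so that CHANGETABLE returns everything that changed since then. A minimal sketch of an incremental load (dbo.Employee and EmployeeID are illustrative names; the application is assumed to persist the version between runs):
DECLARE @last_sync_version BIGINT = 0;  -- version saved by the previous sync
DECLARE @current_version BIGINT = CHANGE_TRACKING_CURRENT_VERSION();

-- LEFT JOIN so deleted rows ('D') still appear, with NULL source columns
SELECT c.SYS_CHANGE_OPERATION, c.EmployeeID, e.*
FROM CHANGETABLE(CHANGES dbo.Employee, @last_sync_version) AS c
LEFT JOIN dbo.Employee AS e
    ON e.EmployeeID = c.EmployeeID;

-- Persist @current_version somewhere durable for the next sync.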


Cheers! Uma

Wednesday, September 6, 2017

How to filter empty or null records in SQL

You can use a CASE expression or an AND condition to filter out empty or NULL records. Recently I came across something interesting for performing this task using NULLIF, sketched below.
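A minimal sketch, assuming a dbo.Employee table with a Name column (illustrative names). NULLIF(x, '') turns an empty string into NULL, so a single IS NOT NULL test covers both cases:
-- Keep only rows where Name is neither NULL nor empty/blank
SELECT *
FROM dbo.Employee
WHERE NULLIF(LTRIM(RTRIM(Name)), '') IS NOT NULL;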

Cheers! Uma

Tuesday, August 15, 2017

How to find age on a particular date

This function can be used to calculate the age on a particular date.


CREATE FUNCTION dbo.FindAge
(
    @Dob    DATETIME,
    @ToDate DATETIME
)
RETURNS INT
AS
BEGIN
    -- Age in completed years: count whole months between the dates,
    -- subtract one month if the day-of-month hasn't been reached yet,
    -- then integer-divide by 12
    RETURN ROUND((CASE
                      WHEN @Dob > @ToDate
                      THEN NULL  -- not born yet on @ToDate
                      WHEN DATEPART(day, @Dob) > DATEPART(day, @ToDate)
                      THEN DATEDIFF(month, @Dob, @ToDate) - 1
                      ELSE DATEDIFF(month, @Dob, @ToDate)
                  END / 12), 0);
END;
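For example, with a birthday that has not yet come around in the target year (dates are illustrative):
-- Returns 26, not 27, because the 20 August birthday hasn't been reached
SELECT dbo.FindAge('1990-08-20', '2017-08-15') AS Age;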


Avoid using DATEDIFF(year, @Dob, @ToDate) directly: it only counts calendar-year boundaries crossed, so someone born on 31 December would show as 1 year old on the following 1 January. The month-based logic above avoids that.


Cheers!
Uma

Monday, July 3, 2017

How to use ORDER BY in a view

You can't use ORDER BY in a view; if you try to include one, it throws the error below. This is a limitation of views in SQL Server. However, you can use ORDER BY together with TOP 100 PERCENT, don't think too much, that's how it works :)

Msg 1033: The ORDER BY clause is invalid in views, inline functions, derived tables, subqueries, and common table expressions, unless TOP, OFFSET or FOR XML is also specified.
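A minimal sketch (dbo.Employee and LastName are illustrative names):
-- Compiles because TOP is present alongside ORDER BY
CREATE VIEW dbo.vw_EmployeeOrdered
AS
SELECT TOP 100 PERCENT *
FROM dbo.Employee
ORDER BY LastName;
Be aware that SQL Server does not guarantee the rows actually come back in that order when you select from the view; for a guaranteed order, put the ORDER BY on the outer query.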


Cheers! Uma

Friday, June 16, 2017

How to check if temporary / permanent objects exists and drop if it exists before creating them

There are many ways to drop existing objects before creating them. One of the common ways is to use OBJECT_ID, as shown below.


--for a permanent table you can use
IF OBJECT_ID('dbo.tabl1', 'U') IS NOT NULL
 DROP TABLE dbo.tabl1;
--for a temporary table you can use
IF OBJECT_ID('tempdb.dbo.#tbl1', 'U') IS NOT NULL
 DROP TABLE #tbl1;


In SQL Server 2016, the new DROP ... IF EXISTS syntax was introduced to perform this task. Many object types can DIE (Drop If Exists), including tables, views, stored procedures, functions, and triggers.
--from SQL Server 2016 and later, to drop tables (including temp tables)
DROP TABLE IF EXISTS
dbo.tabl1, dbo.tab2, dbo.tab3, dbo.tab4, dbo.tab5, #tbl1;
--to drop stored procedures
DROP PROCEDURE IF EXISTS
dbo.sp1, dbo.sp2;
Cheers!
Uma

Wednesday, May 10, 2017

How to setup a Real-Time data analyze using Azure Stream Analytics and PowerBI

These days demand for real-time data analytics is high, and such a system can easily be implemented using cloud technologies. Real-time analytics is the use of, or the capacity to use, data and related resources as soon as the data enters the system.
Few well-known applications of real-time analytics:
  • Real-time credit card fraud detection  
  • CRM real-time analytics can provide up-to-the-minute information about an enterprise's customers and present it so that better and quicker business decisions can be made.
  • Traffic detection
  • Wind speed
In this post, we build such a system using a real-time feed from Twitter and Azure cloud technologies.
You can find Azure Event Hubs and Azure Stream Analytics under the Internet of Things category in the Azure marketplace.
Step1: Create an Azure Event Hubs
Azure Event Hubs is a highly scalable publish-subscribe service that can ingest millions of events per second and stream them into multiple applications. This lets you process and analyze the massive amounts of data produced by your connected devices and applications.
Use Event Hubs to:
  • Log millions of events per second in near real time.
  • Connect devices using flexible authorization and throttling.
  • Use time-based event buffering.
  • Get a managed service with elastic scale.
  • Reach a broad set of platforms using native client libraries.
  • Plug in adapters for other cloud services.
Step2: Create Azure Stream Analytics
Azure Stream Analytics is a fully managed, cost-effective real-time event processing engine that helps to unlock deep insights from data. Stream Analytics makes it easy to set up real-time analytic computations on data streaming from devices, sensors, websites, social media, applications, infrastructure systems, and more.
With a few clicks in the Azure portal, you can author a Stream Analytics job specifying the input source of the streaming data, the output sink for the results of your job, and a data transformation expressed in a SQL-like language. You can monitor and adjust the scale/speed of your job in the Azure portal to scale from a few kilobytes to a gigabyte or more of events processed per second.
Stream Analytics leverages years of Microsoft Research work in developing highly tuned streaming engines for time-sensitive processing, as well as language integrations for specifying such computations intuitively.
In this case, Power BI is used as the output sink where the processed data is stored.

The query used:
SELECT Topic,count(*) AS Count, Avg(SentimentScore) AS AvgSentiment, System.Timestamp AS Insert_Time
FROM [tweets-input] TIMESTAMP BY CreatedAt
GROUP BY TumblingWindow(second,5), Topic
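TumblingWindow(second, 5) groups the stream into contiguous, non-overlapping five-second buckets, so each topic produces one row per bucket. As a hedged variant (assuming the sample feed's SentimentScore runs from 0 to 4 with 2 as neutral), the same pattern can count only negative tweets per topic:
SELECT Topic, COUNT(*) AS NegativeCount, System.Timestamp AS Insert_Time
FROM [tweets-input] TIMESTAMP BY CreatedAt
WHERE SentimentScore < 2
GROUP BY TumblingWindow(second, 5), Topic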
Step3: Feed real-time data to Event Hub
In this case, the Twitter WPF Client application is used as the real-time data feeder. To use it, you should create a Twitter app and an access token for it.
You can create apps at https://apps.twitter.com
Step4: Create a Dashboard using real-time data
Once you start your Azure Stream Analytics job, the data source will become available in Powerbi.com.

Cheers!
Uma