Sep 22, 2010

Exam #1 and #2: What I learned while Studying for 70-433 and 70-451

I learned some things while studying for the database development exams (70-433 and 70-451) that I plan to use on a regular basis on the job!

  1. sp_spaceused 'tablename'
    The above statement shows the space used by an existing table.
  1. sp_estimate_data_compression_savings 'dbo', 'myTable', NULL, NULL, 'PAGE'
    The above statement shows the space that could be saved by implementing PAGE (or ROW) level data compression on the table.
  1. You want to list all objects (procs, views, functions) that reference dbo.mytablename. Be sure to include the schema name of the table (i.e. dbo).
    SELECT * FROM sys.dm_sql_referencing_entities ('dbo.mytablename', 'OBJECT')
  1. You want to see when a procedure was last run. These stats come from the plan cache, so they only go back to the last server restart:

    select s.last_execution_time
    from sys.dm_exec_query_stats s
    cross apply sys.dm_exec_query_plan (s.plan_handle) p
    where object_name(p.objectid) = 'myProcedureName'
  1. After studying the TRY/CATCH, here's what I've decided to use as my favorite CATCH statement:

    IF XACT_STATE() <> 0
    ROLLBACK

    DECLARE @ErrorMessage varchar(max)
    DECLARE @ErrorSeverity int
    DECLARE @ErrorState int

    SET @ErrorMessage = ISNULL(ERROR_PROCEDURE(), '(ad hoc batch)') + ': ' + cast(ERROR_LINE() as varchar(10))
    + ': ' + ERROR_MESSAGE()
    SET @ErrorSeverity = ERROR_SEVERITY()
    SET @ErrorState = ERROR_STATE()
    RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState)
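
    For context, here's a minimal sketch of where that CATCH logic lives. The table and the work inside the TRY block are placeholders I made up:

    BEGIN TRY
    BEGIN TRANSACTION
    UPDATE myTable SET myColumn = 7 WHERE id = 1  -- placeholder work
    COMMIT
    END TRY
    BEGIN CATCH
    -- (the CATCH statements shown above go here)
    END CATCH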
  1. You may already know that parameter sniffing happens automatically: the first time a query is compiled, the plan is built for the parameter values you send it on that first call. Or you can optimize a query for a specific value as follows...

    SELECT * FROM myTable WHERE myColumn = @t OPTION (OPTIMIZE FOR (@t = 7));

    Kinda neat huh? Also, if you cannot update the procedure (maybe it is 3rd party software), then you can optimize the way the procedure is run by using sp_create_plan_guide. Google for it!
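
    Here's a rough sketch of sp_create_plan_guide. It assumes a procedure named dbo.myProcedureName containing exactly the statement below; the guide name and values are made up:

    EXEC sp_create_plan_guide
    @name = N'myPlanGuide',
    @stmt = N'SELECT * from myTable WHERE myColumn=@t',
    @type = N'OBJECT',
    @module_or_batch = N'dbo.myProcedureName',
    @params = NULL,
    @hints = N'OPTION (OPTIMIZE FOR (@t = 7))'

    Note that @stmt has to match the statement text in the procedure character-for-character, or the guide won't be applied.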
  1. Did you know that tables don't have a fillfactor? Only indexes do. And when you have a clustered index on a table, you can specify the fillfactor and then your table will have padded space in it. Interesting, huh?

    ALTER INDEX ALL on myTable REBUILD WITH (FILLFACTOR=60)
    This will first rebuild the clustered index with a fillfactor of 60% (i.e. 40% empty space). It will then rebuild the other non-clustered indexes with a fillfactor of 60%.
  1. You want to identify indexes that are incurring maintenance overhead and are not being used much.

    SELECT * FROM sys.dm_db_index_usage_stats
    WHERE database_id = DB_ID()
    -- and object_id = object_id('myTable')
    ORDER BY user_seeks ASC, user_scans ASC, user_updates DESC

    The user_seeks and user_scans columns tell you how often the index was found useful. The user_updates column tells you how much maintenance overhead the index has cost you.
  1. You want to see how much fragmentation is in your tables or indexes. If it is 30% or more, then you'll want to rebuild the index (i.e. "ALTER INDEX ALL ON myTable REBUILD WITH (ONLINE = ON)")

    SELECT avg_fragmentation_in_percent, object_name(object_id) AS table_name, *
    FROM sys.dm_db_index_physical_stats (DB_ID(), object_id('CRM_BCS_Payment_Data_MIRROR'), NULL, NULL, NULL)
  1. You want to list the queries with the highest average CPU time.

    SELECT TOP 20 total_worker_time / execution_count AS AverageCPUTime,
    qs.plan_handle, qp.query_plan
    FROM sys.dm_exec_query_stats AS qs
    CROSS APPLY sys.dm_exec_query_plan (qs.plan_handle) AS qp
    ORDER BY 1 DESC



These are pretty common and helpful as well...

  1. SET STATISTICS IO ON
    The above command displays the amount of disk activity generated by each T-SQL statement after it executes.
  1. SET STATISTICS TIME ON
    The above command displays the number of milliseconds required to parse, compile, and execute each T-SQL statement after it executes.
  1. SET SHOWPLAN_ALL ON
    This command causes SQL Server not to execute T-SQL statements. Instead, it returns the estimated execution plan and compile-time details of how the statements would be executed.


There were also lots of things I learned about XML and XQUERY (@x.query, @x.value, @x.exist, and @x.modify), but I don't think I'll be using them much since we don't store XML in our databases at work.
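
For posterity, here's a tiny sketch of those four XML methods (the document and values here are made up):

    DECLARE @x xml = '<order id="1"><item qty="2">Widget</item></order>'

    SELECT @x.value('(/order/item/@qty)[1]', 'int')  -- returns 2
    SELECT @x.exist('/order/item')                   -- returns 1
    SELECT @x.query('/order/item')                   -- returns the <item> fragment
    SET @x.modify('insert <item qty="1">Gadget</item> as last into (/order)[1]')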

There were tons of other little things I learned while studying for the exam. But these things listed above are the things I plan to keep with me :-)

Sep 16, 2010

Exam #2: 70-451 (PRO: Designing Database Solutions and Data Access Using Microsoft SQL Server 2008)

I took and passed my second exam! While taking it, I thought I was gonna score a 97% or so... since I thought I had everything right hehehe... but when I clicked the submit button, it churned for a second or two and gave me my score: 85%. I was disappointed! Okay, it's better than the 70% required to pass, so it's all good. But I was really surprised by such a low score, because I really thought I had all the answers right! Wishful thinking, I guess!

As you may know, Exam 70-451 is an extension of Exam 70-433. So it's best to take them back-to-back. You can see from a previous blog what I had studied for Exam 70-433, so all I did for Exam 70-451 was the following:

  • I bought the Transcender Practice Exams for 70-451. The cost was about $221 (which included an exam voucher). I went through all 168 of these Transcender questions and took notes on the ones that were interesting. I highlighted and studied my notes.

Then I took the exam! Simple as pie. And here's my new logo for my resume...