Friday, May 31, 2013

[SQL Server 2008 issues] Storing XML data

Storing XML data

Posted: 30 May 2013 07:18 PM PDT

I have XML data in this format:

[code="plain"]
<?xml version="1.0" encoding="UTF-8"?>
<ns0:GVRedReq xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:ns0='http://fks.com/gv/redemption' xsi:schemaLocation='http://fks.com/gv/redemption GVRedReq.xsd'>
  <ns0:StrCode>1234</ns0:StrCode>
  <ns0:TrmnlId>02</ns0:TrmnlId>
  <ns0:TrnsNo>0123456789</ns0:TrnsNo>
  <ns0:Date>03-05-2013</ns0:Date>
  <ns0:Time>11:30</ns0:Time>
  <ns0:GVDtls>
    <ns0:GVNo>VVD0000021</ns0:GVNo>
    <ns0:GVNo>VVD0000022</ns0:GVNo>
    <ns0:GVNo>BVD0003213</ns0:GVNo>
  </ns0:GVDtls>
</ns0:GVRedReq>
[/code]

I want to store it in my database table, whose structure is:

[code="sql"]
Create table GV_redeemRequest(
    RedeemRequestID int Primary key identity(1,1),
    StoreCode char(4),
    TerminalID int,
    TransactionNO int,
    Date date,
    Time char(5),
    GVDetails varchar(max))
[/code]

How can I achieve this?
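A minimal sketch of one way to do this, assuming the XML arrives in an xml variable: shred the scalar elements with value() and keep the ns0:GVDtls fragment as text for the GVDetails column. Table and column names come from the question; the style-105 conversion assumes the dd-mm-yyyy date format shown.

[code="sql"]
DECLARE @x xml = N'...';   -- the ns0:GVRedReq document shown above

;WITH XMLNAMESPACES ('http://fks.com/gv/redemption' AS ns0)
INSERT INTO dbo.GV_redeemRequest (StoreCode, TerminalID, TransactionNO, [Date], [Time], GVDetails)
SELECT
    @x.value('(/ns0:GVRedReq/ns0:StrCode)[1]', 'char(4)'),
    @x.value('(/ns0:GVRedReq/ns0:TrmnlId)[1]', 'int'),
    @x.value('(/ns0:GVRedReq/ns0:TrnsNo)[1]',  'int'),
    CONVERT(date, @x.value('(/ns0:GVRedReq/ns0:Date)[1]', 'char(10)'), 105),   -- dd-mm-yyyy
    @x.value('(/ns0:GVRedReq/ns0:Time)[1]',    'char(5)'),
    CONVERT(varchar(max), @x.query('/ns0:GVRedReq/ns0:GVDtls'));               -- keep the voucher list as raw XML text
[/code]

If the voucher numbers need to be queried individually later, an xml column (or a child table populated with nodes('/ns0:GVRedReq/ns0:GVDtls/ns0:GVNo')) is usually a better fit than varchar(max) for GVDetails.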

Creating views on multiple tables

Posted: 29 May 2013 10:32 PM PDT

Hi, is it possible to create views using multiple tables from multiple databases? I have 4 servers which are connected remotely, and the same database resides on all of them. I want to create a view that uses the databases on all of the servers, so that the data is available to users regardless of which server they connect to. Your valuable guidelines and suggestions are welcome. Thanks in advance. BILAL AHMAD
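A sketch of one common approach, assuming the other servers are registered as linked servers; the server, database, and table names below are placeholders. The view is built from four-part names with UNION ALL, and each instance would need its own copy pointing at the other servers.

[code="sql"]
-- Hypothetical names: Server2/Server3/Server4 are linked servers, SalesDB.dbo.Customer is the shared table.
CREATE VIEW dbo.vCustomer_AllServers
AS
SELECT 'LOCAL'   AS SourceServer, c.* FROM SalesDB.dbo.Customer         AS c
UNION ALL
SELECT 'Server2' AS SourceServer, c.* FROM Server2.SalesDB.dbo.Customer AS c
UNION ALL
SELECT 'Server3' AS SourceServer, c.* FROM Server3.SalesDB.dbo.Customer AS c
UNION ALL
SELECT 'Server4' AS SourceServer, c.* FROM Server4.SalesDB.dbo.Customer AS c;
[/code]

Note that if the servers hold copies of the same rows (replication), UNION ALL returns duplicates; in that case the view usually reads the local copy only, and high availability is better handled by mirroring, log shipping, or replication than by a view.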

Facing issue looping through in SQL to XML

Posted: 30 May 2013 06:56 PM PDT

I have the following query, which is supposed to create XML for each employee. Here is the query (abbreviated):

[code="sql"]
select E.FirstName + ' ' + E.LastName as 'emp/@Name',
       P.PositionName as 'emp/@Position',
       D.DepartmentName as 'emp/@Department',
       ( select distinct B.EventName as 'event/@ID',
                ( Select (CAST(case when AAA.BookingID = AB.BookingID
                                    then ( select CONVERT(varchar(10), max(BF.FromDate), 103) as DateCompetencyAchieved)
                                    else '1 May 1900' end as Varchar ))) as 'event/@AttendanceDate'
         from #ReportMaster X, Booking B, MU_Attendee_Bookings AAA
         where B.BookingID = AAA.BookingID
         order by B.EventName
         for xml PATH(''), TYPE ) as 'emp'
...
[/code]

The output is as follows:

[code="plain"]
<r caption="Site of Person Involved" fieldName="SitePI">
  <emp Name="name1" Position="Manager" Department="Admin">
    <event ID="99hh" AttendanceDate="20/05/2013" />
    <event ID="MS Excel" AttendanceDate="xx" />
    <event ID="MS Office" AttendanceDate="xx" />
  </emp>
  <emp Name="Name2" Position="Supplier" Department="Accounts">
    <event ID="99hh" AttendanceDate="xx" />
    <event ID="MS Excel" AttendanceDate="10/05/2013" />
    <event ID="MS Office" AttendanceDate="xx" />
  </emp>
  <emp Name="Name2" Position="Supplier" Department="Accounts">
    <event ID="99hh" AttendanceDate="xx" />
    <event ID="MS Excel" AttendanceDate="xx" />
    <event ID="MS Office" AttendanceDate="01/08/2013" />
  </emp>
</r>
[/code]

What I need is the following: I do not want Name2 to repeat; the two outputs related to Name2 should be combined.

[code="plain"]
<r caption="Site of Person Involved" fieldName="SitePI">
  <emp Name="name1" Position="Manager" Department="Admin">
    <event ID="99hh" AttendanceDate="20/05/2013" />
    <event ID="MS Excel" AttendanceDate="xx" />
    <event ID="MS Office" AttendanceDate="xx" />
  </emp>
  <emp Name="Name2" Position="Supplier" Department="Accounts">
    <event ID="99hh" AttendanceDate="xx" />
    <event ID="MS Excel" AttendanceDate="10/05/2013" />
    <event ID="MS Office" AttendanceDate="01/08/2013" />
  </emp>
</r>
[/code]

Can anyone please tell me whether I am making a mistake in looping through the event dates for each employee?
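A sketch of the usual pattern for this kind of nesting, using simplified, hypothetical tables (Employee, Booking, Attendee_Bookings): produce one <emp> element per employee in the outer query, and correlate and group the event subquery on that employee so each employee gets a single element containing all of their events.

[code="sql"]
SELECT  e.FullName AS '@Name',
        e.Position AS '@Position',
        (   SELECT  b.EventName AS '@ID',
                    CONVERT(varchar(10), MAX(ab.FromDate), 103) AS '@AttendanceDate'
            FROM    dbo.Booking AS b
            JOIN    dbo.Attendee_Bookings AS ab ON ab.BookingID = b.BookingID
            WHERE   ab.EmployeeID = e.EmployeeID     -- correlate to the outer employee (avoids the cross join)
            GROUP BY b.EventName                     -- one <event> per event name for this employee
            FOR XML PATH('event'), TYPE )
FROM    dbo.Employee AS e
FOR XML PATH('emp'), ROOT('r');
[/code]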

Passing one value with a @parameter but return two customers

Posted: 30 May 2013 06:54 PM PDT

Hi, I have a problem that I have struggled with for a couple of hours now without being able to solve it. I have a report that returns funding for a particular customer. The customer name is passed to the underlying procedure as a parameter, @customer. So far everything is fine, but here comes the tricky part. In my customer database there have been some errors in the registration, so one of my customers has been registered under two different names, let's say customer A and customer B. In my procedure I have a condition saying customerName = @customer. What I want to do is check the customer parameter, and if it is A or B, have the condition return the data for both customers, not only A or B. One way would be to create two @customer parameters in the report and let the condition be customerName = @customerA OR customerName = @customerB, but I would rather not use that approach.
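A sketch of one way to do this with a single parameter, treating the two known registrations as equivalent in the WHERE clause. The table, column, and customer names are placeholders standing in for the report's real query.

[code="sql"]
-- Hypothetical: dbo.Funding(customerName, amount) stands in for the underlying report query.
DECLARE @customer nvarchar(100) = N'Customer A';

SELECT f.customerName, f.amount
FROM   dbo.Funding AS f
WHERE  f.customerName = @customer
   OR (    @customer      IN (N'Customer A', N'Customer B')   -- the two registrations of the same customer
       AND f.customerName IN (N'Customer A', N'Customer B'));
[/code]

If more duplicate registrations are likely to turn up, a small alias table mapping each registered name to a canonical name keeps the procedure free of hard-coded names.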

Create a Leveling of Intermediaries

Posted: 30 May 2013 04:07 PM PDT

Hi, I have a table called Intermediary, which has columns like:

[code="sql"]
CREATE TABLE Intermediary(
    IntermediaryPK INT,
    IntermediaryID NVARCHAR(20),
    IntermediaryName NVARCHAR(200),
    IntermediaryTypeID NVARCHAR(1),
    ParentIntermediaryID NVARCHAR(20),
    IntermediaryTypePK TINYINT,
    LevelID TINYINT,
    ParentIntermediaryPK INT,
    GrandParentIntermediaryPK INT,
    GrandParentIntermediaryID NVARCHAR(20))
[/code]

The basic idea is to [b]use only the ParentIntermediaryID column to create a leveling of intermediaries[/b]. For example, if someone does not have any children, he will be level 3. The algorithm is:

1. Divide the intermediaries into levels: 1 = grandparent (has grandchildren), 2 = parent (has children), 3 = leaf (no children).
2. For each level 3 intermediary: (a) set the level 2 parent from the parent ID given; (b) set the level 1 parent from the parent's parent ID.
3. For each level 2 intermediary: (a) set the level 2 parent as itself; (b) set the level 1 parent from the parent ID given.
4. For each level 1 intermediary, set the level 1 parent as itself.

I have to write a SQL script to accomplish this algorithm. Please help me out.
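A sketch of one set-based way to implement the algorithm, assuming IntermediaryID values are unique and ParentIntermediaryID is NULL or empty for top-level rows; column names are taken from the table above.

[code="sql"]
-- Step 1: classify levels from ParentIntermediaryID alone.
UPDATE i
SET    LevelID = CASE
           WHEN NOT EXISTS (SELECT 1 FROM Intermediary c
                            WHERE  c.ParentIntermediaryID = i.IntermediaryID) THEN 3   -- leaf
           WHEN EXISTS (SELECT 1 FROM Intermediary c
                        JOIN   Intermediary gc ON gc.ParentIntermediaryID = c.IntermediaryID
                        WHERE  c.ParentIntermediaryID = i.IntermediaryID)    THEN 1    -- has grandchildren
           ELSE 2                                                                      -- has children only
       END
FROM   Intermediary AS i;

-- Steps 2-4: fill the parent / grandparent IDs per level.
UPDATE i                                             -- level 3: parent as given, grandparent = parent's parent
SET    GrandParentIntermediaryID = p.ParentIntermediaryID
FROM   Intermediary AS i
JOIN   Intermediary AS p ON p.IntermediaryID = i.ParentIntermediaryID
WHERE  i.LevelID = 3;

UPDATE i                                             -- level 2: itself as parent, its given parent as grandparent
SET    GrandParentIntermediaryID = i.ParentIntermediaryID,
       ParentIntermediaryID      = i.IntermediaryID  -- both assignments read the pre-update values
FROM   Intermediary AS i
WHERE  i.LevelID = 2;

UPDATE i                                             -- level 1: itself as grandparent
SET    GrandParentIntermediaryID = i.IntermediaryID
FROM   Intermediary AS i
WHERE  i.LevelID = 1;
[/code]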

need script

Posted: 30 May 2013 12:46 AM PDT

I need a script to list all columns that have a NOT NULL constraint on them. The script I have is:

[code="sql"]
USE [R]
GO
SELECT t.name AS table_name,
       SCHEMA_NAME(schema_id) AS schema_name,
       c.name AS column_name
FROM sys.tables AS t
INNER JOIN sys.columns c ON t.OBJECT_ID = c.OBJECT_ID
WHERE c.name is not null
ORDER BY schema_name, table_name;
[/code]

But it is also returning the columns that allow NULLs. Can anyone help?
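The filter c.name IS NOT NULL only checks that the column has a name, which is always true; a column's nullability is exposed in sys.columns.is_nullable. A corrected version of the same query:

[code="sql"]
SELECT  SCHEMA_NAME(t.schema_id) AS schema_name,
        t.name                   AS table_name,
        c.name                   AS column_name
FROM    sys.tables  AS t
JOIN    sys.columns AS c ON c.object_id = t.object_id
WHERE   c.is_nullable = 0                 -- 0 = the column is declared NOT NULL
ORDER BY schema_name, table_name, c.column_id;
[/code]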

Need some help

Posted: 30 May 2013 03:53 PM PDT

Hi friends, recently I faced an interview. I need answers for the questions I am not comfortable answering:

1. What is the difference between SQL Server 2008 and 2008 R2?
2. What is virtual memory?
3. Why do we need MSDTC?
4. What transactions are involved in MSDTC?
5. What is ITIL?
6. What is Change Management?
7. What is Incident Management?

SQL server Query

Posted: 30 May 2013 05:42 PM PDT

Hi, please tell me what query recasting is, with an example. Regards, Anshuman Saini
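For context, "query recasting" (or query rewriting) usually means re-expressing a query so the optimizer can use indexes or a better plan without changing the result. A small illustrative sketch against a hypothetical Orders table:

[code="sql"]
-- Original: the functions on the column prevent an index seek on OrderDate.
SELECT OrderID, OrderDate
FROM   dbo.Orders
WHERE  YEAR(OrderDate) = 2013 AND MONTH(OrderDate) = 5;

-- Recast: the same result, expressed as a range so an index on OrderDate can be seeked.
SELECT OrderID, OrderDate
FROM   dbo.Orders
WHERE  OrderDate >= '20130501'
  AND  OrderDate <  '20130601';
[/code]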

Rounding issue

Posted: 30 May 2013 01:21 PM PDT

Hi everyone, I have a problem with rounding. There is a table:

[code="plain"]
CREATE TABLE [dbo].[Fin3PaymentDisb](
    [PaymentID] [bigint] NOT NULL,
    [Currency] [char](3) NOT NULL,
    [Amount] [money] NOT NULL,
    [Rate] [float] NOT NULL,
    [CurrencyPay] [char](3) NULL,
    [RatePay] [float] NOT NULL,
    [AmountPay] AS ([Amount]*[Rate]),
    CONSTRAINT [PK_Fin3PaymentDisb] PRIMARY KEY CLUSTERED
[/code]

with the values AMOUNT = 3875.0000, RATE = 0.17836, AMOUNTPAY = 691.145. This query returns the record

[code="plain"]
SELECT P.*
FROM Fin3Payment P
INNER JOIN (
    SELECT PaymentID, SUM(AmountPay) AS Amount
    FROM Fin3PaymentDisb
    GROUP BY PaymentID
) AS L ON L.PaymentID = P.PaymentID
      AND ROUND(ROUND(L.Amount, 2) - ROUND(P.Amount - P.Offset - P.SetDisc, 2), 2) <> 0
[/code]

where PaymentAmount = 691, Offset = -0.15 and SetDisc = 0, i.e. 691.145 is rounded to 691.14. But in the query

[code="plain"]
SELECT ROUND(691.145, 2)
[/code]

the result is 691.15. What could be the problem here? It runs on MS SQL 2005 Express SP2. Thanks.
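A likely explanation, worth verifying on the server in question: AmountPay is money * float, which yields a float, and 3875 * 0.17836 in binary floating point can come out fractionally below 691.145, so ROUND(..., 2) goes down to 691.14. The literal 691.145 is an exact decimal, which rounds up to 691.15. A small repro sketch:

[code="sql"]
SELECT
    ROUND(CAST(3875.00 AS money) * CAST(0.17836 AS float), 2) AS rounded_float,    -- float arithmetic, typically 691.14
    ROUND(691.145, 2)                                          AS rounded_decimal,  -- exact decimal literal: 691.15
    CAST(3875.00 AS money) * CAST(0.17836 AS float)            AS raw_product;      -- shows the value near, but not exactly, 691.145
[/code]

Defining the computed column with an explicit conversion, e.g. CAST([Amount]*[Rate] AS money), is one way to keep the comparison in exact decimal arithmetic.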

How to calculate space needed to modify a column in a table?

Posted: 30 May 2013 01:39 PM PDT

I have a situation that's driving me INSANE... The vendors of our financial processing system came to us asking that we increase a column in a table from char(15) to char(19) to allow extra room for a field needed by the web application. Typically this type of change wouldn't be such a big deal, however this table has about 90 million rows in it and has a data footprint of about 214GB.

Now unless I'm missing something (which is very possible, as I'm no expert at anything), issuing an ALTER COLUMN like this via SSMS involves the creation of a new "temp" table, transferring the data from the existing table to the new table, creating all non-clustered indexes on the new table, then dropping the old table and renaming the new one.

Consider the command:

[code="sql"]
ALTER TABLE MyProblemChild
ALTER COLUMN [FleetCardVehicleNo] char (19) NULL
GO
[/code]

I did this on a testing server with attached RAIDed storage and it took about 1.5 hours, with the log file pre-sized at 230GB - not too bad.

When I attempted to run the same process on our production server (which has the data, index, and log files on a SAN, each residing on its own respective LUN), the process ran for over 5 hours and grew the log file to over 490GB... and only stopped there because the LUN ran out of space, so I had to kill/rollback the entire thing!

How can I calculate how much log file space will be needed to successfully complete the operation?
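There is no exact formula, but a working estimate (an assumption, not a guarantee) is that a size-of-data change like this logs at least the full size of the data it rewrites, plus index maintenance, so the starting point is simply how big the table and its indexes are. A sketch for gathering those numbers before the change:

[code="sql"]
-- Reserved, data, and index sizes for the table being altered (table name from the question).
EXEC sp_spaceused 'dbo.MyProblemChild';
[/code]

Because a single ALTER COLUMN runs as one transaction, the log cannot be truncated past it while it runs, so the log must hold the whole operation at once. An alternative some DBAs use is to add a new char(19) column, copy the data over in batches, then drop the old column and rename, which lets regular log backups keep the log in check.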

Need MDX query

Posted: 30 May 2013 01:53 PM PDT

Hi all, I need one MDX query to display the first three characters of the month name. Please give example queries.

Strange BETWEEN behavior... anyone have an explanation for this?

Posted: 30 May 2013 08:13 AM PDT

I wrote some code that will basically search an entire directory of .trn files (log ships in my case). It uses RESTORE HEADERONLY, inserts the results into a temp table, and then uses BETWEEN with the FirstLSN column and the LastLSN column to find the LSN you are looking for. I'll post the code in its entirety below (in case someone would find it useful), but this is the query that actually does the aforementioned check, more or less:

QUERY:

[code="sql"]
select convert(NUMERIC(38), FirstLSN) as 'FirstLSN',
       convert(NUMERIC(38), lastlsn)  as 'lastlsn'
from #filelist
where 17017000006794800011 BETWEEN FirstLSN AND LastLSN
[/code]

RESULT:

[code="plain"]
FirstLSN             lastlsn
17017000005762601000 17017000006794799000   (WRONG)
17017000006794799000 17017000007787600000   (RIGHT)
[/code]

The #filelist table (which contains the results of each HEADERONLY query) defines FirstLSN and LastLSN as FLOAT and the other fields as varchar. It returns the correct result, but it also returns an additional record that seems to be erroneous. In the full version below I use a variable called @SearchLSN, which is also defined as type FLOAT; this is necessary for the BETWEEN comparison as it only compares like types. Anyone have any idea why BETWEEN would return two results, with one being incorrect? The wrong record, by the way, is always the record immediately before the correct record in the table.

Here is the code in its entirety in case it's useful for someone (please excuse the formatting):

[code="sql"]/*This script will search for a particular LSN number within a directory of .trn files. The log file must use .trn for the extension. It will also return ALL the information for every file if you enter a 0(zero) for the search lsn.Written by kenneth.gore@gmailDOTcomUse as you wish, modify as you wish..................Need more? Email me :)*/SET QUOTED_IDENTIFIER OFF GOSET ANSI_NULLS ON GOset nocount ondeclare @restoreFromDir varchar(255) = '\\Place\Your\directory\path\here\without\backslash' --Holds the location of the .trn filesdeclare @SearchLSN Float = 17017000006794800011 --Holds the lsn you are searching for. 
A value of 0 to diplay ALL the information from every file.declare @DisplayasWeGo bit = 1 -- Determines wether information for each file will be displayed as its found or JUST the file that you are searching for at the end of the query.if exists ( select * from tempdb.dbo.sysobjects o where o.xtype in ('U') and o.id = object_id(N'tempdb..#dirList'))DROP TABLE #dirList;if exists ( select * from tempdb.dbo.sysobjects o where o.xtype in ('U') and o.id = object_id(N'tempdb..#filelist'))DROP TABLE #filelist;declare @filename varchar(100), @cmd varchar(500)create table #dirList (filename varchar(255))create table #filelist ( BackupName varchar(255),BackupDescription varchar(255),BackupType varchar(255),ExpirationDate varchar(255),Compressed varchar(255),Position varchar(255),DeviceType varchar(255),UserName varchar(255), ServerName varchar(255),DatabaseName varchar(255),DatabaseVersion varchar(255),DatabaseCreationDate varchar(255), BackupSize varchar(255),FirstLSN Float, LastLSN Float,CheckpointLSN Float, DatabaseBackupLSN Float, BackupStartDate varchar(255),BackupFinishDate varchar(255), SortOrder varchar(255), [CodePage] varchar(255), UnicodeLocaleId varchar(255),UnicodeComparisonStyle varchar(255), CompatibilityLevel varchar(255),SoftwareVendorId varchar(255),SoftwareVersionMajor varchar(255), SoftwareVersionMinor varchar(255),SoftwareVersionBuild varchar(255),MachineName varchar(255),Flags varchar(255),BindingID varchar(255),RecoveryForkID varchar(255),Collation varchar(255),FamilyGUID varchar(255),HasBulkLoggedData varchar(255),IsSnapshot varchar(255),IsReadOnly varchar(255),IsSingleUser varchar(255),HasBackupChecksums varchar(255),IsDamaged varchar(255),BeginsLogChain varchar(255),HasIncompleteMetaData varchar(255),IsForceOffline varchar(255),IsCopyOnly varchar(255),FirstRecoveryForkID varchar(255),ForkPointLSN varchar(255),RecoveryModel varchar(255),DifferentialBaseLSN varchar(255),DifferentialBaseGUID varchar(255),BackupTypeDescription varchar(255),BackupSetGUID varchar(255),CompressedBackupSize varchar(255),)select @cmd = 'dir /b /on "' + @restoreFromDir+ '"'insert #dirList exec master..xp_cmdshell @cmd --select * from #dirList where filename like '%.trn%' --order by filenamedeclare BakFile_csr cursor for select * from #dirList where filename like '%.trn%' --order by filename--select * from #dirList where filename like '%.trn%'open BakFile_csrfetch BakFile_csr into @filenamewhile @@fetch_status = 0 begin select @cmd = "RESTORE HEADERONLY FROM DISK = '" + @restoreFromDir + "\" + @filename + "'" insert #filelist exec ( @cmd ) if @DisplayasWeGo = 1 exec ( @cmd ) PRINT '' PRINT 'Getting information from ' + @filename fetch BakFile_csr into @filename end -- BakFile_csr loopclose BakFile_csrdeallocate BakFile_csrif @SearchLSN = 0 select BackupName , BackupDescription , BackupType , ExpirationDate , Compressed , Position , DeviceType, UserName , ServerName , DatabaseName , DatabaseVersion , DatabaseCreationDate , BackupSize , convert(NUMERIC(38), FirstLSN) as 'FirstLSN' , convert(NUMERIC(38), lastlsn) as 'lastlsn' , convert(NUMERIC(38), CheckpointLSN) as 'CheckpointLSN', convert(NUMERIC(38), DatabaseBackupLSN) as 'DatabaseBackupLSN', BackupStartDate , BackupFinishDate , SortOrder , [CodePage] , UnicodeLocaleId , UnicodeComparisonStyle , CompatibilityLevel , SoftwareVendorId , SoftwareVersionMajor , SoftwareVersionMinor , SoftwareVersionBuild , MachineName , Flags , BindingID , RecoveryForkID , Collation , FamilyGUID , HasBulkLoggedData , IsSnapshot , IsReadOnly , IsSingleUser , 
HasBackupChecksums , IsDamaged , BeginsLogChain , HasIncompleteMetaData , IsForceOffline, IsCopyOnly , FirstRecoveryForkID , ForkPointLSN , RecoveryModel , DifferentialBaseLSN , DifferentialBaseGUID , BackupTypeDescription , BackupSetGUID , CompressedBackupSize from #filelistElse select 'The LSN searched for was found in the following transaction log backup.' as "Search Results", BackupName , BackupDescription , BackupType , ExpirationDate , Compressed , Position , DeviceType, UserName , ServerName , DatabaseName , DatabaseVersion , DatabaseCreationDate , BackupSize , convert(NUMERIC(38), FirstLSN) as 'FirstLSN' , convert(NUMERIC(38), lastlsn) as 'lastlsn' , convert(NUMERIC(38), CheckpointLSN) as 'CheckpointLSN', convert(NUMERIC(38), DatabaseBackupLSN) as 'DatabaseBackupLSN', BackupStartDate , BackupFinishDate , SortOrder , [CodePage] , UnicodeLocaleId , UnicodeComparisonStyle , CompatibilityLevel , SoftwareVendorId , SoftwareVersionMajor , SoftwareVersionMinor , SoftwareVersionBuild , MachineName , Flags , BindingID , RecoveryForkID , Collation , FamilyGUID , HasBulkLoggedData , IsSnapshot , IsReadOnly , IsSingleUser , HasBackupChecksums , IsDamaged , BeginsLogChain , HasIncompleteMetaData , IsForceOffline, IsCopyOnly , FirstRecoveryForkID , ForkPointLSN , RecoveryModel , DifferentialBaseLSN , DifferentialBaseGUID , BackupTypeDescription , BackupSetGUID , CompressedBackupSize from #filelist where @SearchLSN BETWEEN FirstLSN AND LastLSNdrop table #dirListdrop table #filelistreturnGOSET QUOTED_IDENTIFIER OFF GOSET ANSI_NULLS ON GO[/code]
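A possible explanation worth testing: FLOAT carries only about 15-17 significant decimal digits, and these LSNs have 20, so FirstLSN, LastLSN, and @SearchLSN are silently rounded before BETWEEN ever compares them, which can pull in the neighbouring backup file. NUMERIC(25,0) holds the full value. A small sketch of the difference:

[code="sql"]
DECLARE @AsFloat   float          = 17017000006794800011;
DECLARE @AsNumeric numeric(25, 0) = 17017000006794800011;

SELECT  CONVERT(numeric(25, 0), @AsFloat) AS float_roundtrip,   -- precision already lost
        @AsNumeric                        AS numeric_exact;

-- Declaring FirstLSN/LastLSN as NUMERIC(25,0) in #filelist (and using a NUMERIC(25,0)
-- @SearchLSN) keeps the BETWEEN comparison exact, so only the correct file matches.
[/code]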

Help With Loading XML File into Varbinary Field

Posted: 30 May 2013 08:48 AM PDT

Hello. I have a table that has a column with the Varbinary(MAX) data type. This column typically contains an XML file or a zipped XML file. I have a requirement to replace that field's value with a new XML file. Can someone help me with the script? I have used the OPENROWSET function to insert the XML file as a new record, but couldn't use that function for an UPDATE.
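OPENROWSET(BULK ... SINGLE_BLOB) can feed an UPDATE as well as an INSERT. The sketch below assumes hypothetical table and column names (dbo.Documents, DocBlob, DocID) and a file path readable by the SQL Server service account:

[code="sql"]
UPDATE dbo.Documents
SET    DocBlob = (SELECT src.BulkColumn                               -- the whole file as one varbinary(max) value
                  FROM   OPENROWSET(BULK N'C:\files\new_document.xml', SINGLE_BLOB) AS src)
WHERE  DocID = 42;                                                    -- the row whose blob is being replaced
[/code]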

VIEWS

Posted: 30 May 2013 04:57 AM PDT

I would like to display a certain number of views (say 50) in one report, and then in that same report display which views have not been accessed, along with their fields.

Merge between two tables, on two different databases.

Posted: 30 May 2013 06:51 AM PDT

Hi friends, I have the following issue. I have one table called customer in the Cust database on SQL Server 2008 R2, which is my destination table, and another table called customer (same name and same fields) on an Oracle server, which is my source table. I have read-only access to the Oracle table through the LINKED SERVER folder in SQL Server 2008 R2. Now I am confused about how to do the incremental ETL (insert, update and delete) between the source and destination tables, either via a stored procedure or an SSIS package. Please help me.
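One stored-procedure-style sketch, assuming the Oracle linked server is called ORA_SRC, the table sits under a SCOTT schema there, and CustomerID identifies a row in both systems (all of these names are placeholders): MERGE handles the insert, update, and delete in one statement.

[code="sql"]
MERGE dbo.customer AS tgt                                     -- destination in the Cust database
USING (SELECT CustomerID, CustomerName, City
       FROM   ORA_SRC..SCOTT.CUSTOMER) AS src                 -- read-only Oracle source via the linked server
      ON tgt.CustomerID = src.CustomerID
WHEN MATCHED AND (tgt.CustomerName <> src.CustomerName
              OR  tgt.City         <> src.City)
    THEN UPDATE SET tgt.CustomerName = src.CustomerName,
                    tgt.City         = src.City
WHEN NOT MATCHED BY TARGET
    THEN INSERT (CustomerID, CustomerName, City)
         VALUES (src.CustomerID, src.CustomerName, src.City)
WHEN NOT MATCHED BY SOURCE
    THEN DELETE;
[/code]

For large tables it is often faster to stage the Oracle rows into a local table first (or use an SSIS data flow with a lookup) and run the MERGE against the staging table, since a MERGE directly over a linked server pulls the whole remote table on every run.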

Help with next date in SQL query

Posted: 17 May 2013 10:04 PM PDT

What I need is to be able to find out when a customer next called in after the date I have selected, with the next date & time of any purchase; the RC does not need to be the same. It's like using MIN & MAX, but that does not work on this query as it selects the same date & time for both.

[code="sql"]
SELECT ID, Name, CallDateTime, Num, RC
FROM History
WHERE (CallDateTime > CONVERT(DATETIME, '2013-05-01 00:00:00', 102)) AND (Outcome = 'Called')
[/code]

As you can see in the query above, all the data is in one overall table called History, which records all the purchases. So I want to know: if a customer called in after 01/05/2013 with the outcome 'Called', what was their next purchase date? Some customers might not have come back, so that can stay blank. The query output looks like this now:

[code="plain"]
ID      Name  CallDateTime        Num RC
3936803 Name1 01/05/2013 11:16:27 84  Called
5211387 Name2 01/05/2013 12:14:21 604 Called
5185689 Name3 01/05/2013 12:15:28 298 Called
4811923 Name4 01/05/2013 12:29:36 170 Called
[/code]

but I also want it to show the below:

[code="plain"]
ID      Name  CallDateTime     Num RC     Next CallDateTime Total Number Of Days
3936803 Name1 01/05/2013 11:16 84  Called 04/05/2013 11:16  3
5211387 Name2 01/05/2013 12:14 604 Called 04/05/2013 12:14  3
5185689 Name3 01/05/2013 12:15 298 Called 04/05/2013 12:15  3
4811923 Name4 01/05/2013 12:29 170 Called 04/05/2013 12:29  3
[/code]

This is the query I have at the moment, but I am not sure how to add the two extra columns: the next called-in date & time after the first call (regardless of what the RC code of that next visit is), and how many days it took from the first date to the next date.
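A sketch of one way to add those two columns with OUTER APPLY (so customers with no later purchase stay NULL), assuming the customer is identified by ID and that every purchase is a row in History:

[code="sql"]
SELECT  h.ID, h.Name, h.CallDateTime, h.Num, h.RC,
        nxt.CallDateTime                                AS [Next CallDateTime],
        DATEDIFF(DAY, h.CallDateTime, nxt.CallDateTime) AS [Total Number Of Days]
FROM    History AS h
OUTER APPLY (SELECT TOP (1) n.CallDateTime
             FROM   History AS n
             WHERE  n.ID = h.ID                         -- same customer
               AND  n.CallDateTime > h.CallDateTime     -- any later visit, whatever the RC code
             ORDER BY n.CallDateTime) AS nxt
WHERE   h.CallDateTime > CONVERT(datetime, '2013-05-01 00:00:00', 102)
  AND   h.Outcome = 'Called';
[/code]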

Servers Consolidation and Linked Servers

Posted: 28 May 2013 07:55 AM PDT

Hi, we are planning to consolidate 5 servers into a 2-node active/active cluster. Some of the servers have linked servers between them. After the consolidation, I guess linked servers between instances inside the same server/cluster won't be necessary anymore, but I can't figure out what I could substitute for them so that the instances keep communicating. Any idea? Thanks, Dok
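For what it's worth: if two databases end up on the same instance after the consolidation, cross-database three-part names replace the linked-server four-part names; if they end up on different instances of the cluster, linked servers are still needed, just re-pointed at the new instance names. A sketch with placeholder names:

[code="sql"]
-- Before consolidation: cross-server query through a linked server
SELECT o.OrderID, o.OrderDate
FROM   OLDSRV2.SalesDB.dbo.Orders AS o;

-- After consolidation, when SalesDB lives on the same instance as the calling database
SELECT o.OrderID, o.OrderDate
FROM   SalesDB.dbo.Orders AS o;
[/code]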

Service Master and Database keys

Posted: 30 May 2013 04:16 AM PDT

I've inherited some SQL Servers (2008, 2008 R2). How can I tell if there is a Service Master Key or any database master keys?
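A quick check: the service master key and each database master key show up in sys.symmetric_keys under well-known names. A sketch of the queries (the database master key query needs to be run in each database of interest):

[code="sql"]
-- Service master key (lives in master; created automatically the first time it is needed)
SELECT name, create_date, key_length
FROM   master.sys.symmetric_keys
WHERE  name = '##MS_ServiceMasterKey##';

-- Run in each database: does it have a database master key?
SELECT name, create_date, key_length
FROM   sys.symmetric_keys
WHERE  name = '##MS_DatabaseMasterKey##';

-- Per database, whether its master key is protected by the service master key
SELECT name, is_master_key_encrypted_by_server
FROM   sys.databases;
[/code]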

Export table with Geography data type

Posted: 30 May 2013 03:17 AM PDT

Hi all, as the title suggests I need to export the data from one table to another database in a completely different domain, but I am having problems as it has a Geography data type. Using the export wizard, it complains about data type conversions. Does anyone know of a better way of doing this? Thanks
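One workaround worth trying, sketched with hypothetical names (dbo.Locations, GeoCol, a staging table on the destination): serialize the geography values to well-known text plus SRID for the transfer, then rebuild them on the destination, so the export/import tooling only ever sees plain text and integer columns.

[code="sql"]
-- Export side: serialize the geography column as well-known text + SRID
SELECT  LocationID,
        GeoCol.STSrid     AS Srid,
        GeoCol.STAsText() AS Wkt
FROM    dbo.Locations;

-- Import side: rebuild the geography value from the transported text
INSERT INTO dbo.Locations (LocationID, GeoCol)
SELECT LocationID, geography::STGeomFromText(Wkt, Srid)
FROM   dbo.Locations_Staging;
[/code]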

No formatting when importing into Excel using connection

Posted: 30 May 2013 07:13 AM PDT

Hey, I've searched around for this and found a topic, but I think they were trying to do something a bit different: http://www.sqlservercentral.com/Forums/Topic976362-391-1.aspx?Highlight=no+format+excel+connection. I think they were trying to have the formatting done on the fly, while I'd like to have my formatting preset in Excel and have the connection not change column lengths, etc. when it imports, since I want to put some Excel calculation-driven tables below my connection table that have field names of different lengths. Does anyone know of a simple solution for doing this? I thought I'd seen it done before in a file I inherited from someone at a previous employer, but haven't had any luck finding a solution recently. Thanks in advance.

SQL Consulting Help

Posted: 23 May 2013 11:41 PM PDT

My company is in need of a SQL Server consultant. We need someone to come in and do an evaluation of our SQL Server configuration, look at our databases, look at our SSRS configuration, look at our indexes (and lack thereof), etc. And when all is said and done, we would like for them to leave us with a plan for how to monitor performance and an action plan for maintaining performance. I am asking for recommendations as I do not know where to look for this type of work. I'm sure there are thousands of companies I could Google, but I'm hoping someone here can help me cut through the chaff. We'd also want to call around and talk to references who can vouch for the company/individual. Does anyone have any recommendations? Thank you. Joel

Backup problem - log can't back up in time before the next log backup

Posted: 30 May 2013 04:54 AM PDT

Hello. Our SAN admin takes care of the SQL backups, is running into a problem, and asked me for advice. He is doing a log backup every 3 hours, and 50 GB of changes are being recorded in the log within those 3 hours. The time it takes to back up is exceeding 3 hours, causing the next log backup to fail. With the log backup failing, the log keeps growing and the drive gets full. I'm not sure what type of activity is being done or whether it can be changed. What are the options to avoid this? Does the RPO need to be changed? Thanks, Dave

Question on SARGability

Posted: 30 May 2013 05:55 AM PDT

I have a reporting project where the users can choose to filter on multiple field parameters. The filter can be a specific value, or "ANY" to not filter on that particular column. In a nutshell, my T-SQL to pull data is simplified down to what's pasted below. In reality there are multiple joins involved and the filter fields aren't necessarily in the same tables. My question is whether the following T-SQL might be considered SARGable, or if there's a preferred best practice that might work better in this case. Some of these queries are in fairly long stored procedures, so dynamic SQL isn't necessarily a good option. Thanks in advance for your thoughts, Andre Ranieri

[code="sql"]
SELECT somefielda, somefieldb, somefieldc, somefieldd
FROM SomeTable
WHERE somefielda = (CASE WHEN @filtera = 'ANY' THEN somefielda ELSE @filtera END) AND
      somefieldb = (CASE WHEN @filterb = 'ANY' THEN somefieldb ELSE @filterb END) AND
      somefieldc = (CASE WHEN @filterc = 'ANY' THEN somefieldc ELSE @filterc END) AND
      somefieldd = (CASE WHEN @filterd = 'ANY' THEN somefieldd ELSE @filterd END)
[/code]
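For what it's worth, a commonly used variant of this catch-all pattern keeps the column bare on the left-hand side and adds OPTION (RECOMPILE), so the optimizer can discard the 'ANY' branches at execution time and still seek on the filters that were supplied. A sketch using the same placeholder names:

[code="sql"]
SELECT somefielda, somefieldb, somefieldc, somefieldd
FROM   SomeTable
WHERE  (somefielda = @filtera OR @filtera = 'ANY')
  AND  (somefieldb = @filterb OR @filterb = 'ANY')
  AND  (somefieldc = @filterc OR @filterc = 'ANY')
  AND  (somefieldd = @filterd OR @filterd = 'ANY')
OPTION (RECOMPILE);   -- recompiles per execution; trades CPU for a plan tailored to the actual filter values
[/code]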

find dates where over x # of rows were created?

Posted: 30 May 2013 06:18 AM PDT

Hello - I need to query a table to find out dates on which more than a certain number of records were created. The CreatedOn/ModifiedOn columns have a datetime data type. This is just an ad hoc query so performance is not important for this query. Thanks!
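A sketch of the usual GROUP BY / HAVING approach, assuming the table is called dbo.MyTable and the threshold is 1000 (both placeholders):

[code="sql"]
SELECT  CAST(CreatedOn AS date) AS created_date,
        COUNT(*)                AS rows_created
FROM    dbo.MyTable
GROUP BY CAST(CreatedOn AS date)       -- one bucket per calendar day
HAVING  COUNT(*) > 1000                -- only days that exceeded the threshold
ORDER BY rows_created DESC;
[/code]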

NOLOCK hint.

Posted: 28 May 2013 03:08 AM PDT

We have a high-activity transactional database and we have put NOLOCK hints on all SELECT statements. For some time now we have been experiencing timeouts in spite of the NOLOCK hint. I thought that with the NOLOCK hint we should not get timeouts? The same queries run OK from SSMS and we don't see an issue with optimization; the problem occurs when WCF makes hundreds of hits per minute. (It is OK for us to have dirty reads because of the NOLOCK hints, but we can't have timeouts.)

using dbcc show statistics

Posted: 30 May 2013 12:51 AM PDT

Is it possible to identify candidate keys using DBCC SHOW_STATISTICS, and if so, how would I do it? I am married to a download process that moves everything all the time, because the business stewards maintain that there is no way to tell one record from another, so they must extract the entire system every night. This flies in the face of reality to me, because if there is no way to tell what is unique, how can their application work correctly? But the fact is that their application is a 35-year-old file-based system that has no database; application logic does everything, i.e. there is no database back end minding the store. Can this command help me identify which combinations of columns are candidates for uniqueness, so that we do not have to download and reload the same zillion records every night? Thanks
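Two sketches that can help here (table, index, and column names are placeholders). The density vector from DBCC SHOW_STATISTICS hints at near-uniqueness (a density close to 1 divided by the row count means the column combination is almost unique), while a plain GROUP BY proves it outright:

[code="sql"]
-- Density of the leading column(s) of an existing statistic or index
DBCC SHOW_STATISTICS ('dbo.StagingTable', 'IX_StagingTable_AcctId') WITH DENSITY_VECTOR;

-- Definitive check: does this column combination ever repeat?
SELECT AcctId, TransDate, LineNum, COUNT(*) AS dup_count
FROM   dbo.StagingTable
GROUP BY AcctId, TransDate, LineNum
HAVING COUNT(*) > 1;          -- no rows returned means the combination is a candidate key
[/code]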

Scalar Valued functions

Posted: 30 May 2013 02:59 AM PDT

I have a stored proc that calls a scalar-valued function. When I run the scalar-valued function (select dbo.f_ScalarFunction(@Param1, @Param2, @Param3)) and check Include Actual Execution Plan, I see that it does a CONSTANT SCAN. My underlying queries in the function are well optimized; they do an Index Seek if run in isolation. The question is: does a scalar-valued function not expose the query plan of its underlying SQL queries? Do they always show up as a CONSTANT SCAN?

Data Compression - CPU Impact

Posted: 30 May 2013 02:46 AM PDT

I have a SQL 2008 R2 Enterprise server that handles a strictly reporting-based workload. The database has grown very quickly, and I am looking for ways to extend the lifespan of my current storage. I have identified some tables that would be good candidates for row and/or page compression; however, I am curious about the actual CPU impact of doing so. My current server is somewhat CPU bound, in that a good number of reports are able to fully utilize several CPU cores, but the server clearly has more cores than its workload requires. This brings me to my question about the CPU utilization impact of page/row compression.

Are the CPU resources required to compress/decompress data as it is read from disk handled in separate CPU thread(s) from the threads that are performing the actual query workload? By that I mean: since I clearly have additional CPU cores available to do work, could I expect these "unused" cores to pick up the slack of decompressing the data, so that the CPU resources needed to perform the query logic would still run at peak efficiency? Thanks.

BCP Failure

Posted: 30 May 2013 12:33 AM PDT

I am using bcp to create a text file. It was working fine but stopped working. Here is the bcp command:

[code="sql"]
exec master..xp_cmdshell 'bcp ''select clientmatter from mytemp order by acctid'' queryout e:\sql_backup\test.txt -c, -T'
[/code]

The text file is not being created. When I manually place the file in the target directory, the email will not work with @file_attachments='e:\sql_backup\test.txt'. This is the error message:

[code="plain"]
Executed as user: NT AUTHORITY\NETWORK SERVICE. Attachment file e:\sql_backup\test.txt is invalid. [SQLSTATE 42000] (Error 22051). The step failed.
[/code]

When I remove the @file_attachments='e:\sql_backup\test.txt', the email works. Thanks in advance. Mike
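Two things worth checking, sketched below with placeholder database and server names: bcp queryout generally wants the query in double quotes with a fully qualified table name (and the trailing comma after -c is probably meant to be a separate -t, switch), and both the bcp write and the attachment read happen under service accounts (the job ran as NT AUTHORITY\NETWORK SERVICE) that need NTFS rights on e:\sql_backup.

[code="sql"]
-- Hypothetical corrected command: query in double quotes, three-part table name,
-- -c for character format, -t, for a comma field terminator, -T trusted connection, -S server.
EXEC master..xp_cmdshell
    'bcp "select clientmatter from MyDatabase.dbo.mytemp order by acctid" queryout e:\sql_backup\test.txt -c -t, -T -S MYSERVER';
[/code]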

Policy Management, Log Backups

Posted: 30 May 2013 02:06 AM PDT

Hello, I have created a policy that reports a violation when a database does not have a log backup within a 15-minute time span. It works; however, it also alerts me about databases that are not in the maintenance plans and therefore should not be backed up. Does anyone know how to exclude these? I have a function that checks for databases that are accessible, in full recovery mode, and not system databases. I use this condition with the policy, but it does not allow me to pick specific databases, and I can't find a facet that checks whether a database is in a maintenance plan. If you are still with me by the end of this: the email we get is not very descriptive. Is there a way to mention which databases triggered the violation in the email? Thanks for any help...

Linked servers migration

Posted: 30 May 2013 01:58 AM PDT

I have to migrate linked servers from SQL Server 2005 to SQL Server 2008 R2. Could somebody please tell me how to do this? Should I script them out?

How to build/configure a reporting server in SQL Server 2008

Posted: 29 May 2013 08:36 PM PDT

Dear All,
1- We have SQL Server 2008 with 2 nodes in a cluster (Active/Passive).
2- Due to a performance problem we are going to separate the reports from the database.
3- I want to create a reporting server. As I am new to this, please let me know the step-by-step process to create a reporting server.
Thanks in advance. Regards,

The log for database is not available; Database was shutdown due to error 9001 in routine 'XdesRMFull::Commit'.

Posted: 04 May 2011 07:39 AM PDT

Testing and production DBs have lost connection to the transaction log two days in a row now. Today, my prod DB was marked suspect after the issue - SCARY. The other DBs did not lose connection, possibly because there was no activity at that moment. No errors in the SQL log, only Windows. Server resources were not being hammered. I will be scouring the web, but wanted to reach out to all of you as well. See info and errors below.

Plenty of available drive space for the log, DB, and tempdb partitions. 144 GB RAM. SQL Server 2008 SP1, Enterprise (64-bit). OS: Windows Server 2008 R2 Enterprise.

[b]Windows application logs:[/b]
error 1 - LogWriter: Operating system error 21 (The device is not ready.) encountered.
error 2 - The log for database (testing) is not available. Check the event log for related error messages. Resolve any errors and restart the database.
info message 3 - Database was shutdown due to error 9001 in routine 'XdesRMFull::Commit'. Restart for non-snapshot databases will be attempted after all connections to the database are aborted.

[b]2 seconds later the prod DB goes down:[/b]
error 4 - The log for database is not available. Check the event log for related error messages. Resolve any errors and restart the database.
error 5 - During undoing of a logged operation in database, an error occurred at log record ID (86400:39070:17). Typically, the specific failure is logged previously as an error in the Windows Event Log service. Restore the database or file from a backup, or repair the database.
error 6 - fcb::close-flush: Operating system error (null) encountered.
error 7 - An error occurred during recovery, preventing the database (PRODUCTION :w00t:) from restarting. Diagnose the recovery errors and fix them, or restore from a known good backup. If errors are not corrected or expected, contact Technical Support.
info message 8 - CHECKDB for database finished without errors on 2011-03-14 12:12:41.503 (local time). This is an informational message only; no user action is required.

Delete takes too much time

Posted: 29 May 2013 10:08 PM PDT

Hi, I am trying to delete some old data to speed up a SELECT statement, but the delete takes too much time.

[code="sql"]
delete from Table_name where CAST(ReportTime as date) < CAST((GETDATE()-30) as date)
[/code]

The table has about 100,000 records to delete, and it takes more than 15 minutes. Thanks in advance
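A sketch of two common remedies, assuming an index on ReportTime is possible: remove the CAST from the column so the predicate can use that index, and delete in small batches so each transaction stays short and locks and log growth stay manageable.

[code="sql"]
DECLARE @cutoff datetime = DATEADD(DAY, -30, CONVERT(date, GETDATE()));

WHILE 1 = 1
BEGIN
    DELETE TOP (5000)                  -- small batches keep locks and log growth short-lived
    FROM   dbo.Table_name
    WHERE  ReportTime < @cutoff;       -- bare column: an index on ReportTime can be seeked

    IF @@ROWCOUNT = 0 BREAK;           -- nothing left to delete
END
[/code]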
