Tuesday, March 12, 2013

[SQL Server 2008 issues] Encryption doubts


Encryption doubts

Posted: 11 Mar 2013 07:21 PM PDT

I want to encrypt my database. Which method is the simplest and best? If I use a symmetric key, should I encrypt it with the database master key or with a password? Which is safer here?
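A minimal sketch of the two options the question asks about, assuming a database master key already exists; all object names and the password are placeholders:

```sql
-- Option 1: symmetric key protected by a certificate, which is in turn
-- protected by the database master key (no password in application code).
CREATE CERTIFICATE MyCert WITH SUBJECT = 'Data encryption cert';
CREATE SYMMETRIC KEY MyKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE MyCert;

-- Option 2: symmetric key protected directly by a password, which must
-- then be supplied (and safeguarded) by every caller.
CREATE SYMMETRIC KEY MyKeyPwd
    WITH ALGORITHM = AES_256
    ENCRYPTION BY PASSWORD = '<strong password here>';
```

Protection via the master-key chain is generally easier to manage, since SQL Server can open the chain automatically for authorized users instead of requiring the password in every calling batch.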

Query optimization

Posted: 07 Mar 2013 04:09 PM PST

Hello friends, I have a table that contains millions of records. When I fire a query to get records by id, it returns millions of rows and takes 2-3 seconds. How can I optimize the query? My SQL query is:

[code="sql"]SELECT *
FROM Table1
WHERE intId = @intId
  AND bintTimeStamp >= @bintStartTimeStamp
  AND bintTimeStamp < @bintEndTimeStamp
ORDER BY bintTimeStamp[/code]
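A common first step for a query shaped like the one above is a composite index on the equality column followed by the range column (the index name and `dbo` schema below are assumptions):

```sql
-- Seek on intId, then range-scan bintTimeStamp in order; the index order
-- also satisfies the ORDER BY without an extra sort.
CREATE NONCLUSTERED INDEX IX_Table1_intId_bintTimeStamp
    ON dbo.Table1 (intId, bintTimeStamp);
```

Selecting only the needed columns (or adding them with INCLUDE) instead of `SELECT *` avoids key lookups back to the base table.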

Larger Date Range taking MUCH less time

Posted: 10 Mar 2013 09:32 PM PDT

Hi, I'm executing the T-SQL below on a table with about 700,000 records. I am clearing the cache so it does not affect the queries. When the date range is between '2013-01-01' and '2013-01-31' (or even wider), it takes less than one second to return the data. But when the date range is between '2013-01-09' and '2013-01-31', it takes ages (4 min). Can anyone give me an explanation for why this is happening? Thanks, JP

[quote]DBCC DROPCLEANBUFFERS
go
DBCC FREEPROCCACHE
go
Select STLStkCode
From STranLine
Where STranLine.STLTranDate Between '2013-01-09' and '2013-01-31'[/quote]
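When a narrower range runs slower than a wider one, the optimizer's row estimate has usually tipped it into a different plan; the statistics on the date column are the first thing worth inspecting (the index name below is hypothetical):

```sql
-- Compare the histogram's estimated rows for each range against the
-- actual counts; a stale histogram often explains a plan flip like this.
DBCC SHOW_STATISTICS ('dbo.STranLine', 'IX_STranLine_STLTranDate');

-- If the histogram looks stale, refreshing it is a cheap experiment:
UPDATE STATISTICS dbo.STranLine WITH FULLSCAN;
```

Comparing the actual execution plans for the fast and slow ranges would confirm whether the access path changes.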

inserting data from one table to another and want to create primary key at same time

Posted: 11 Mar 2013 04:36 PM PDT

I am creating a table by using an INSERT INTO from another table, then deleting any duplicate rows from that table, then trying to create a primary key on column SN_Original, but I get an error saying:

Msg 8111, Level 16, State 1, Line 27
Cannot define PRIMARY KEY constraint on nullable column in table 'lookuptable'

Is there a way round this? Here is my code:

[code="sql"]DROP TABLE lookuptable

SELECT dbo.bigtable.Software_Name_Original AS SN_Original,
       dbo.bigtable.Software_Name_Raw AS SN_New
INTO lookuptable
FROM dbo.BigTable
ORDER BY dbo.bigtable.Software_Name_Raw

-- Delete duplicate rows from lookup table based on SN_New
DELETE f
FROM (SELECT ROW_NUMBER() OVER (PARTITION BY SN_New ORDER BY SN_New) AS rn
      FROM lookuptable) AS f
WHERE rn > 1

ALTER TABLE lookuptable
ADD CONSTRAINT SN_OriginalPK PRIMARY KEY CLUSTERED (SN_Original);[/code]
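SELECT ... INTO copies the source column's nullability, so the new column has to be made NOT NULL before a primary key can be added. A hedged sketch of the extra step (the varchar(255) data type is an assumption; match it to the source column's actual type):

```sql
-- Remove or fix any NULLs first, or the ALTER COLUMN will fail.
DELETE FROM lookuptable WHERE SN_Original IS NULL;

-- Make the column non-nullable, then the PK can be created.
ALTER TABLE lookuptable
    ALTER COLUMN SN_Original varchar(255) NOT NULL;

ALTER TABLE lookuptable
    ADD CONSTRAINT SN_OriginalPK PRIMARY KEY CLUSTERED (SN_Original);
```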

Inserting currency symbols in money datatypes

Posted: 07 Mar 2013 03:21 PM PST

Hi, I heard we can insert currency symbols like $ into money and smallmoney datatypes. While trying this, I realized we can insert $ into a money datatype, but it will not display the symbol when viewing the table contents. Am I missing anything, or is that just how it works? If so, what is the need for the money datatype, since we could use decimal or nvarchar instead? Thank you
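This is the expected behavior: money accepts a currency symbol on input but stores only the numeric value, so the symbol has to be re-added at display time. A small sketch:

```sql
-- The '$' is accepted as part of a money literal but is not stored.
DECLARE @m money = $1234.56;

-- Re-attach a symbol for display by converting to a string;
-- money style 1 adds thousands separators.
SELECT @m AS StoredValue,
       '$' + CONVERT(varchar(20), @m, 1) AS Formatted;
```

Storing the value numerically is the point of the type: it keeps arithmetic and sorting correct, while formatting stays a presentation-layer concern.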

maximum rows without duplication

Posted: 11 Mar 2013 08:26 AM PDT

Consider this table:

[code="sql"]declare @a table (a int, b int, c char(1))
insert into @a
select 1,17,'a' union all
select 1,15,'b' union all
select 2,15,'c' union all
select 3,14,'d' union all
select 4,13,'e' union all
select 3,13,'f' union all
select 5,12,'g' union all
select 6,12,'h' union all
select 6,11,'i' union all
select 7,10,'j' union all
select 8,9,'k' union all
select 8,10,'l'[/code]

My goal is to get the maximum number of rows in my result without any duplicate values within columns a or b across all rows. In this case I should get 8 rows. I tried the two queries below, but I only get 6 or 7 rows respectively, because an eliminated row causes a qualifying row to be eliminated when they share a value in a particular column. Thanks!!

[code="sql"]SELECT c
FROM (SELECT ROW_NUMBER() OVER (PARTITION BY a ORDER BY c asc) Rank_a,
             ROW_NUMBER() OVER (PARTITION BY b ORDER BY c asc) Rank_b,
             c
      FROM @a) t
WHERE rank_a = 1
  AND rank_b = 1[/code]

[code="sql"]SELECT c
FROM (SELECT ROW_NUMBER() OVER (PARTITION BY b ORDER BY c asc) Rank_b, c
      FROM (SELECT ROW_NUMBER() OVER (PARTITION BY a ORDER BY c asc) Rank_a, b, c
            FROM @a) t
      WHERE rank_a = 1) u
WHERE rank_b = 1[/code]

my backup maintenance plan is failing

Posted: 10 Mar 2013 08:48 PM PDT

Hi friends, my backup plan is failing. From the error logs (3/6/2013) I got this information; can anyone throw some light on it?

[quote]Executing the query "EXECUTE master.dbo.xp_create_subdir N'z:\Core'..." failed with the following error: "xp_create_subdir() returned error 3, 'The system cannot find the path specified.'". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.

Executing the query "BACKUP DATABASE [Core] TO DISK = N'z:\Core\..." failed with the following error: "Cannot open backup device 'z:\Core\Core_backup_2013_03_06_021517_6310479.bak'. Operating system error 3 (The system cannot find the path specified.). BACKUP DATABASE is terminating abnormally."

Executing the query "BACKUP DATABASE [ReportServer$SRVR1] TO DISK = N'..." failed with the following error: "Cannot open backup device 'z:\ReportServer$SRVR1\ReportServer$SRVR1_backup_2013_03_06_021517_7770625.bak'. Operating system error 3 (The system cannot find the path specified.). BACKUP DATABASE is terminating abnormally."

Executing the query "BACKUP DATABASE [SharePoint_Config_Dev] TO DISK =..." failed with the following error: "Cannot open backup device 'z:\SharePoint_Config_Dev\SharePoint_Config_Dev_backup_2013_03_06_021517_7990647.bak'. Operating system error 3 (The system cannot find the path specified.). BACKUP DATABASE is terminating abnormally."[/quote]
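Every failure above is operating system error 3 (path not found), so the first thing worth checking is whether the Z: drive is visible to the account SQL Server runs under. A quick hedged check from a query window:

```sql
-- Lists the local drives the SQL Server service can see; if Z: is a
-- mapped network drive, it typically will not appear here.
EXEC master.dbo.xp_fixeddrives;

-- Retry the folder creation the maintenance plan attempted.
EXEC master.dbo.xp_create_subdir N'z:\Core';
```

Mapped drive letters belong to an interactive session, not to the service account, which is a common reason a maintenance plan cannot reach a drive the DBA can browse to.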

Find month totals between date columns

Posted: 11 Mar 2013 03:29 AM PDT

I have a sample view with some dates. How would you find the number of items open per month? Say, between OpenDate and CloseDate, how many were open for January, February, and so on? Here is a sample table with the data:

[code="sql"]CREATE TABLE [dbo].[TestDate](
 [ItemTitle] [nvarchar](50) NULL,
 [ItemAttachAssignDate] [date] NULL,
 [ItemDetachConcludeDate] [date] NULL,
 [Status] [nvarchar](50) NULL,
 [FullName] [nvarchar](100) NULL,
 [OpenDate] [date] NULL,
 [CloseDate] [date] NULL
) ON [PRIMARY]
GO[/code]

[code="sql"]INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2013-02-18 00:00:00', '2013-02-19 00:00:00', 'Done', 'Jeff Hunter', '2013-02-18 00:00:00', '2013-02-19 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-15 00:00:00', '2013-02-05 00:00:00', 'Done', 'Tommy Johnson', '2013-01-22 00:00:00', '2013-01-28 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-15 00:00:00', '2013-02-05 00:00:00', 'Done', 'Jeff Haynes', '2012-10-17 00:00:00', '2013-02-01 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-15 00:00:00', '2013-02-05 00:00:00', 'Done', 'Nancy Belkin', '2012-10-28 00:00:00', '2012-12-14 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-15 00:00:00', '2013-02-05 00:00:00', 'Done', 'Rudolph Porche', '2013-01-16 00:00:00', '2013-02-02 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-20 00:00:00', '2013-02-07 00:00:00', 'Done', 'Pat Franks', '2013-01-20 00:00:00', '2013-01-25 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-20 00:00:00', '2013-02-07 00:00:00', 'Done', 'Leslie Jordan', '2012-11-25 00:00:00', '2012-12-04 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-20 00:00:00', '2013-02-07 00:00:00', 'Done', 'Fred Haney', '2012-10-20 00:00:00', '2013-02-04 00:00:00');
INSERT INTO [TestDate]([ItemAttachAssignDate], [ItemDetachConcludeDate], [Status], [FullName], [OpenDate], [CloseDate]) VALUES('2012-10-20 00:00:00', '2013-02-07 00:00:00', 'Done', 'Henry Hanks', '2012-10-31 00:00:00', '2012-11-15 00:00:00');[/code]
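One set-based approach to the question above is to build a list of month starts and count each item against every month its open interval touches. A hedged sketch against the sample table (the month range is hardcoded to the sample data for brevity):

```sql
-- Count items open during any part of each month: an item overlaps a
-- month if it opened before the month ends and closed on/after it starts.
;WITH Months AS (
    SELECT CAST('2012-10-01' AS date) AS MonthStart
    UNION ALL
    SELECT DATEADD(MONTH, 1, MonthStart)
    FROM Months
    WHERE MonthStart < '2013-02-01'
)
SELECT m.MonthStart,
       COUNT(t.OpenDate) AS OpenItems
FROM Months m
LEFT JOIN dbo.TestDate t
       ON t.OpenDate  <  DATEADD(MONTH, 1, m.MonthStart)
      AND t.CloseDate >= m.MonthStart
GROUP BY m.MonthStart
ORDER BY m.MonthStart;
```

For production use, the month list would be derived from MIN(OpenDate)/MAX(CloseDate) or a calendar table rather than hardcoded.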

Is it Best Practice to select E: for the SQL Server root directory?

Posted: 11 Mar 2013 05:05 AM PDT

Hi Experts, is it best practice to select E: or any other drive (not C:) for the SQL Server root directory? Please let me know the pros/cons and the practices to be followed. Thanks a lot in advance.

How to view stats used in query?

Posted: 11 Mar 2013 09:23 AM PDT

Hello all, I could use some help trying to better understand how stats are used in this situation. I have a query that performs relatively poorly (14 seconds); however, the execution plan is actually quite good. It uses an index and performs a seek, no problems here. The overall plan is decent, with some areas for improvement. Anyway, a developer of mine added an index and reduced the run time quite significantly (it dropped to 3 seconds). Yet, after reviewing the index utilization, I found that the new index was not being touched (both the DMVs and the execution plan tell the same story); instead, the query was still using the original index. So I then created a statistic matching his index, achieved the same performance gain, and yet my execution plan is 100% the same.

With the above said, I'm curious to understand a couple of things:
1) How, if at all, can you view which stats are touched during query execution?
2) Why would the optimizer not use the statistic already on the index it used within the execution plan? (Yes, I did update stats.)
3) What really determines how "auto create stats" takes effect?
Thanks

SQL query 4 weeks 3 business days

Posted: 11 Mar 2013 02:19 AM PDT

Hello, I need to write a query that returns a result set from within the past 4 weeks and 3 business days. For example, the result set for today would return everything from Feb 6th forward (the date of this post was March 11). I have the 4-week part, but I am not sure how to include the 3 business days. Any help would be appreciated.

[code="sql"]select dateadd(week, -4, getdate());[/code]

-Dave
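One hedged way to extend the 4-week step above: walk backwards one day at a time, counting only weekdays. This sketch assumes an English-language session (DATENAME returns English day names) and ignores holidays:

```sql
-- Step back 4 weeks, then walk back 3 more weekdays.
DECLARE @d date = DATEADD(WEEK, -4, GETDATE());
DECLARE @busDays int = 3;

WHILE @busDays > 0
BEGIN
    SET @d = DATEADD(DAY, -1, @d);
    IF DATENAME(WEEKDAY, @d) NOT IN ('Saturday', 'Sunday')
        SET @busDays = @busDays - 1;
END

SELECT @d AS CutoffDate;  -- use as: WHERE SomeDateColumn >= @d
```

A calendar table with an IsBusinessDay flag is the more robust approach when holidays matter.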

Converting datetime format

Posted: 11 Mar 2013 08:18 AM PDT

I have the following table:

[code="sql"]SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [dbo].[Tbl1](
 [Old_Date] [datetime] NULL,
 [New_Date] [datetime] NULL
) ON [PRIMARY]

INSERT INTO TBL1(Old_Date) values('2012-12-31')
INSERT INTO TBL1(Old_Date) values('2013-01-01')
INSERT INTO TBL1(Old_Date) values('2013-01-02')
INSERT INTO TBL1(Old_Date) values('2013-01-03')[/code]

Old_Date gets its values from an SSIS package in the date format 'YYYY-MM-DD'. Using T-SQL, I am trying to convert Old_Date into the New_Date column with the format 'MM-DD-YYYY', but for some reason it is not working out. I tried the following statement:

[code="sql"]SELECT OLD_DATE, CONVERT(DATETIME, OLD_DATE, 110) AS NEW_DATE FROM TBL1[/code]

But in SQL Server I am seeing the same YYYY-MM-DD format in New_Date instead of MM-DD-YYYY. Can anyone help me out here? Thank you!!
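The key point here is that DATETIME stores no display format at all, so converting a DATETIME to DATETIME cannot change how it renders; a specific display format requires converting to a string. A small sketch:

```sql
-- Style 110 produces mm-dd-yyyy, but only as character data.
SELECT Old_Date,
       CONVERT(varchar(10), Old_Date, 110) AS New_Date_Display
FROM dbo.Tbl1;
```

If New_Date must remain a datetime column, it will always display in the client tool's default rendering; the MM-DD-YYYY formatting belongs in the presentation layer or in a computed string column.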

To not RBAR

Posted: 11 Mar 2013 03:31 AM PDT

I have a table of taxonomic entries. It is for paleontology, where the entire taxonomic structure is often not known. Someone may find a clearly new type of brachiopod (phylum) and name it in honor of his favorite musician Jamesus (genus) hetfieldii (species), but be unable to determine any of the intermediate taxonomic levels (class, order, family); often the various auxiliary levels (sub-order, super-family) don't even exist. I needed a method to allow any taxonomic level to 'bind upwards' to any level above it. The table structure I created looks like this:

[code="sql"]CREATE TABLE [dbo].[Taxonomy](
 [TaxonAutoID] [int] IDENTITY(1,1) NOT NULL,
 [TaxonName] [varchar](100) NOT NULL,
 [TaxonLevelID] [int] NOT NULL,
 [NextHigherTaxonAutoID] [int] NULL,
 [AltTexts] [varchar](4000) NULL,
 CONSTRAINT [PK_Taxonomy] PRIMARY KEY CLUSTERED …[/code]

Each entry has the identity column of the next higher taxonomic unit (whatever level that may be) as one of its properties. One of the things I need to do with this table is retrieve the entire string of entries, starting at some arbitrary point and proceeding upwards until I reach the top level (kingdom).

Using RBAR, it's simple: a loop, using the NextHigherTaxonAutoID field of each entry as the search key for the next iteration, until I hit the top level. The absolute maximum possible number of levels is 21, and no entry will have anywhere near that, so it's probably not a huge performance issue, but some queries will ask for many records at once, and this has to be assembled for each record.

Is there a way to do this assembly using a set-based query? It seems to me that there ought to be a way, using FOR XML, CROSS APPLY or some such, but my skills in the more esoteric regions of T-SQL are still somewhat lacking.
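The usual set-based tool for walking a parent pointer like NextHigherTaxonAutoID is a recursive CTE. A hedged sketch against the table above (@startId is a hypothetical parameter naming the starting entry):

```sql
DECLARE @startId int = 42;  -- hypothetical starting taxon

;WITH Chain AS (
    -- Anchor: the starting entry.
    SELECT TaxonAutoID, TaxonName, TaxonLevelID, NextHigherTaxonAutoID,
           0 AS Depth
    FROM dbo.Taxonomy
    WHERE TaxonAutoID = @startId

    UNION ALL

    -- Recursive step: follow the pointer upward until it is NULL.
    SELECT t.TaxonAutoID, t.TaxonName, t.TaxonLevelID, t.NextHigherTaxonAutoID,
           c.Depth + 1
    FROM dbo.Taxonomy t
    JOIN Chain c ON t.TaxonAutoID = c.NextHigherTaxonAutoID
)
SELECT TaxonName, TaxonLevelID, Depth
FROM Chain
ORDER BY Depth
OPTION (MAXRECURSION 25);  -- safely above the stated 21-level maximum
```

This also works for many starting rows at once: replace the anchor's WHERE clause with a join to the set of records being processed, carrying the starting key through the recursion.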

Trace Flag 4044

Posted: 11 Mar 2013 05:36 AM PDT

4044 - The SA account can be unlocked by rebooting the server with this trace flag. If the sa (or sso_role) password is lost, add this to your RUN_server file. This will generate a new password when the server starts.

[url]http://www.sqlservercentral.com/articles/trace+flags/70131/[/url]
[url]http://social.technet.microsoft.com/wiki/contents/articles/13105.trace-flags-in-sql-server.aspx[/url]

I've seen this trace flag listed on a couple of web sites but cannot get it to work. I've tried adding it as a startup parameter in Configuration Manager. SQL starts, but there's no output, and I can't find the RUN_server file anywhere. All the searching I've done points to Sybase (it is also a Sybase trace flag). I was just interested to test it out; it could be handy when users install a local instance and forget their password. Has anyone else implemented this successfully? Thanks, -Jeff

OUTPUT variables in Profiler

Posted: 11 Mar 2013 01:41 AM PDT

The following is what SQL Profiler shows for a stored procedure call:

[code="sql"]declare @p1 int
set @p1=20611159
exec dbo.Set_Phone_6 @p1 output,NULL,NULL,NULL,NULL
select @p1
go[/code]

In the sproc I have this logic: if @p1 is null, do an insert, set the value of @p1 to the newly created identity, and exit. If @p1 is not null, it's an update: do the update and exit. My developer is actually issuing the call to the sproc passing @p1 as NULL, and the result of his call is an INSERT. Why is Profiler showing "set @p1=20611159"? Thanks!

Getting the files stored in a database table?

Posted: 11 Mar 2013 07:26 AM PDT

Hi All, I am struggling to find a solution / guidelines / third-party or internal database tools that would allow me to extract files stored in a database table. These files have multiple formats: PDF, Word, Excel, images, etc. Please advise me; I have limited to no knowledge of Visual Studio. Thanks, Yasar

T-SQL Code Review: Looping Record Deletion w. Date Validation

Posted: 11 Mar 2013 03:41 AM PDT

Hi there, I put this script together based on some code I found in a few other threads on this forum after doing a search. This script is for a development environment, but after it is thoroughly tested and vetted it may be deployed to a prod environment. I am simply pruning old records from some staging tables. Any feedback or words of wisdom would be wonderful. Thanks!

[code="sql"]DECLARE @DeleteDate DATETIME,
        @RowsToDelete BIGINT

-- Calculate the min. date plus 1, which will be the first range of dates to delete
SET @DeleteDate = ( SELECT DATEADD(DAY, 1, MIN([cslpnle].[DateCreated])) AS [DeleteDate]
                    FROM [dbo].[RawVehicle] (NOLOCK) AS cslpnle )

-- Delete rows while data is older than 5 days
PRINT @DeleteDate
WHILE DATEDIFF(DAY, @DeleteDate, GETUTCDATE()) > 1
BEGIN
    SET @RowsToDelete = 1

    -- Purge RawVehicles
    WHILE @RowsToDelete > 0
    BEGIN
        DELETE TOP ( 200 ) [dbo].[RawVehicleOption]
        FROM [dbo].[RawVehicleOption]
        INNER JOIN [dbo].[RawVehicle] AS rv
            ON [dbo].[RawVehicleOption].[RawVehicleId] = [rv].[Id]
        WHERE [rv].[DateCreated] <= @DeleteDate

        DELETE TOP ( 200 ) rip
        FROM [dbo].[RawInventoryPhoto] AS rip
        INNER JOIN [dbo].[RawInventoryVehicle] AS riv
            ON rip.[RawInventoryVehicleId] = [riv].[Id]
        INNER JOIN [dbo].[RawVehicle] AS rv
            ON [riv].[RawVehicleId] = [rv].[Id]
        WHERE [rv].[DateCreated] <= @DeleteDate

        DELETE TOP ( 200 ) rdltiv
        FROM [dbo].[RawDealerLotToInventoryVehicle] AS rdltiv
        INNER JOIN [dbo].[RawInventoryVehicle] AS riv
            ON [rdltiv].[RawInventoryVehicleId] = [riv].[Id]
        INNER JOIN [dbo].[RawVehicle] AS rv
            ON [riv].[RawVehicleId] = [rv].[Id]
        WHERE [rv].[DateCreated] <= @DeleteDate

        DELETE TOP ( 200 ) riv
        FROM [dbo].[RawInventoryVehicle] AS riv
        INNER JOIN [dbo].[RawVehicle] AS rv
            ON [riv].[RawVehicleId] = [rv].[Id]
        WHERE [rv].[DateCreated] <= @DeleteDate

        DELETE TOP ( 200 )
        FROM [dbo].[RawVehicle]
        WHERE [DateCreated] <= @DeleteDate

        SET @RowsToDelete = @@ROWCOUNT
    END

    SET @DeleteDate = DATEADD(DAY, 1, @DeleteDate)
END[/code]

Permissions for DTEXEC

Posted: 11 Mar 2013 07:07 AM PDT

Any suggestions are welcome. I am calling DTEXEC like this in the context of database A:

'"E:\Program Files (x86)\Microsoft SQL Server\100\DTS\Binn\dtexec.exe Mypackage"'

The package does execute. The source query within the package calls a procedure in database A, and the connection manager within the package is in the context of database A. The package executes EXEC MySchema.MyProcA. Within procedure MySchema.MyProcA, a separate procedure is called; this procedure is in database B. So within procedure MySchema.MyProcA I make my cross-database execution call:

EXEC databaseB.schema.MyProcB

The error message:

An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E14 Description: "schema.MyProcB: NT user not authorized".

I assume the account that needs permissions is the SQL Server service account. Is this correct?

Index confusion

Posted: 11 Mar 2013 05:29 AM PDT

Hi everyone. I have a table with a few million rows in it. As part of a stored procedure, a process checks whether data exists by selecting TOP 1 from the table before moving on. The developer told me that it gets slower and slower as more data is added to the table, so we looked at ways to speed things up. They are running a query like this:

[code="sql"]SELECT TOP 1 * FROM TABLEA WHERE COLUMN4 = 'xxxx' and COLUMN6 = 'xxxx' and COLUMN9 = 'xxxx'[/code]

There were no good indexes that would help us, so I created a nonclustered index that includes COLUMN4, COLUMN6, and COLUMN9. After creating this index, the execution plan shows that the query is using the new index. My question is: why does the above query have the same execution plan as the query below?

[code="sql"]SELECT TOP 1 COLUMN4 FROM TABLEA WHERE COLUMN4 = 'xxxx' and COLUMN6 = 'xxxx' and COLUMN9 = 'xxxx'[/code]

The second query can get all of its output from the index and should not need to visit the table at all, right? The top query requests all columns and should need to fetch the unindexed columns from the actual table, right? Why would the two selects have the same execution plan? Thanks for helping me understand.
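One hedged way to make the difference (or lack of one) concrete is to compare the actual I/O of the two statements side by side, rather than relying on the graphical plan alone:

```sql
-- Run both forms in one batch and compare the 'logical reads' lines
-- in the Messages tab; a lookup back to the base table shows up there
-- even when the plan shapes look similar at a glance.
SET STATISTICS IO ON;

SELECT TOP 1 * FROM TABLEA
WHERE COLUMN4 = 'xxxx' AND COLUMN6 = 'xxxx' AND COLUMN9 = 'xxxx';

SELECT TOP 1 COLUMN4 FROM TABLEA
WHERE COLUMN4 = 'xxxx' AND COLUMN6 = 'xxxx' AND COLUMN9 = 'xxxx';

SET STATISTICS IO OFF;
```

It is also worth checking the *actual* (not estimated) plans: with TOP 1 only a single row is fetched, so any extra lookup step is cheap enough that the two plans can cost out almost identically.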

Efficient Method for a stored Procedure

Posted: 11 Mar 2013 05:05 AM PDT

Hello, I am a SQL Server newbie. I am writing a stored procedure which has one input parameter. I have written it as shown below, using two temporary tables, and I wanted to know if there is a more efficient way of writing it. (I have simplified the stored procedure.)

[code="sql"]-- StoredProcedureX @Id Int (Id sent from another stored procedure)
CREATE TABLE #TempTable1 (Name1 varchar(50), Name2 varchar(50), Name3 varchar(50), Marks int)

INSERT INTO #TempTable1
SELECT Name, Name1, Name2, Marks
FROM TableA
WHERE Id = @Id

CREATE TABLE #TempTable2 (Address varchar(100), City varchar(50))

INSERT INTO #TempTable2
SELECT Address, City
FROM TableA
WHERE Name  = (SELECT Name  FROM #TempTable1)
  AND Name1 = (SELECT Name1 FROM #TempTable1)
  AND Name2 = (SELECT Name2 FROM #TempTable1)
  AND Marks > 100

-- Some other processing on #TempTable2
-- DROP TABLE #TempTable1, #TempTable2[/code]
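Since both temp tables are filled from the same TableA, the first lookup can often be replaced with a self-join, removing one temp table entirely. A hedged sketch (the schema follows the post; exact column types and the real procedure's extra logic are assumptions):

```sql
-- a: the row(s) selected by @Id; b: the matching rows with Marks > 100.
DECLARE @Id int = 1;  -- placeholder for the procedure's parameter

SELECT b.Address, b.City
INTO #TempTable2      -- keep the temp table only if later steps need it
FROM TableA AS a
JOIN TableA AS b
  ON  b.Name  = a.Name
  AND b.Name1 = a.Name1
  AND b.Name2 = a.Name2
WHERE a.Id = @Id
  AND b.Marks > 100;
```

Note that the original `= (SELECT ... FROM #TempTable1)` pattern fails if @Id matches more than one row; the join form handles multiple matches naturally.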

Cross Apply Executing Too Many Times

Posted: 11 Mar 2013 06:02 AM PDT

I have a rather complex set of queries that are mostly embedded in table valued functions and call one another in chains. I've had a number of very peculiar performance issues that seem to come and go, but that don't seem to be directly related to the queries themselves (heavily indexed tables).Currently I'm examining a particular cross apply that seems to be behaving strangely. I'm simplifying a bit, but the two following queries return the "same" result, just different ways; the main difference is that the first one has an additional column hardcoded instead of relying on the cross join to fill it in. [quote]select 'LA13G12ST-05', 10000, pidcm.met, pidcm.quantityFound from table_valued_function('LA13G12ST-05', [b]10000[/b], '00000630', 1, 'S0000013122', 'LA2013') pidcmunion allselect 'LA13G12ST-05', 20000, pidcm.met, pidcm.quantityFound from table_valued_function('LA13G12ST-05', [b]20000[/b], '00000630', 1, 'S0000013122', 'LA2013') pidcm[/quote][quote]select 'LA13G12ST-05', pidc.[Line No_], pidcm.met, pidcm.quantityFound from (select distinct pidc.[Line No_] from table1 pid join table2 pidc on pid.[Pidc Code] = pidc.Code where pid.Code = 'LA2013' and pid.[Line No_] = 10000) pidc cross apply table_valued_function('LA13G12ST-05', [b]pidc.[Line No_][/b], '00000630', 1, 'S0000013122', 'LA2013') pidcm[/quote]Yes, the subquery does return exactly two rows, 10000 and 20000So, when I run the both queries, I get the expected (and identical) results from both[quote]LA13G12ST-05 10000 0 3LA13G12ST-05 20000 0 0LA13G12ST-05 10000 0 3LA13G12ST-05 20000 0 0[/quote]However, the statistics are very different:[quote]SQL Server parse and compile time: CPU time = 0 ms, elapsed time = 3 ms.(2 row(s) affected)Table '#71DCD509'. [b]Scan count 1, logical reads 1[/b], physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.Table '#70E8B0D0'. 
Scan count 1, logical reads 1, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times: CPU time = 813 ms, [b]elapsed time = 809 ms[/b].

(2 row(s) affected)
Table '#71DCD509'. [b]Scan count 73, logical reads 78[/b], physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table2'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'Table1'. Scan count 0, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

SQL Server Execution Times: CPU time = 28859 ms, [b]elapsed time = 28898 ms[/b].[/quote]

Curious why the latter required 73 times as many scans, I ran the SQL Profiler and found that the table valued function runs 72 times in the cross apply example. It may be purely a coincidence, but there happen to be 72 rows in the subquery if you remove its where clause. So it looks to me as if SQL Server is unrolling my subquery, moving its "where" to the outer query, executing the cross apply, and then filtering. Unfortunately, this ends up being much more expensive.

So, is there some way to convince the query optimizer to do otherwise? I expected that adding the "distinct" to the subquery (not technically needed) would help the optimizer know it should do the inner part first, but it didn't. Being from an Oracle background, the first thing that comes to mind is to put the inner query in a cursor and loop through its results, calling the function for each row, so I can control the number of function calls. Such behavior seems to be frowned upon in this forum and disparagingly referred to as RBAR, so I'm open to alternatives.

I'm sure someone will ask whether it is possible to avoid table valued functions and roll all the logic into a single SQL statement. It would be very difficult (perhaps impossible) because of two things:

1) Analytic functions aren't fully available in SQL Server 2008 R2 (and I can't easily upgrade to 2012); specifically, I need to limit something by a running total, and the "rows between" clause isn't available yet on the analytic sum function.
2) In several cases, completely different queries need to run depending on some other setting or function result (a case statement runs different queries). It might be possible to join all the possible results and discard the ones that don't apply, but I'm not sure that would be good for performance anyway.

Edit: I found a workaround, sort of; I can select the subquery into a temporary table and then cross apply against that temporary table instead of the subquery. That solves this particular problem, but it also forces this function to be a multi-statement table valued function, whereas before it could have been in-line. Since it is calling a multi-statement TVF (one with a gigantic case), I suppose the distinction is largely irrelevant. I don't suppose there is some optimizer hint that says "this part is expensive"? I tried force order, but it didn't help (maybe it applies only to joins, not applies).

modifying a SP

Posted: 07 Mar 2013 08:03 AM PST

We already have an SP which selects data from the db. I need to modify the SP so that it selects one more column, called Supervisor. The code to get the supervisor data is:

[code="sql"]SELECT p.personid,
       p.LastName + N', ' + p.FirstName AS Supervisor,
       s.StaffID
INTO #StaffNames
FROM Staff AS s
INNER JOIN people AS p
    ON s.PersonID = p.personid

SELECT peoplelinkid, staffassigned, Supervisor
FROM clientinfo AS ci
LEFT JOIN #StaffNames AS s
    ON ci.StaffAssigned = s.StaffID
WHERE ci.staffassigned <> 0

DROP TABLE #StaffNames[/code]

This code works perfectly and I am able to get the desired result. Now the problem is that I am not able to figure out how to put the above code into the SP which selects the data. Thanks in advance.
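A working ad-hoc batch like the one above can usually be dropped into the procedure body as-is. A hedged skeleton (dbo.MyExistingProc is a placeholder for the real procedure, whose other parameters and logic would be kept):

```sql
ALTER PROCEDURE dbo.MyExistingProc
AS
BEGIN
    SET NOCOUNT ON;

    -- Build the supervisor lookup first...
    SELECT p.personid,
           p.LastName + N', ' + p.FirstName AS Supervisor,
           s.StaffID
    INTO #StaffNames
    FROM Staff AS s
    JOIN people AS p ON s.PersonID = p.personid;

    -- ...then add the Supervisor column to the procedure's existing SELECT.
    SELECT ci.peoplelinkid, ci.staffassigned, s.Supervisor
    FROM clientinfo AS ci
    LEFT JOIN #StaffNames AS s ON ci.StaffAssigned = s.StaffID
    WHERE ci.staffassigned <> 0;

    DROP TABLE #StaffNames;  -- optional: local temp tables die with the proc
END
```

Temp tables created inside a procedure are scoped to it, so no other changes are needed for the `#StaffNames` pattern to work there.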

After Error 823 Database stayed on "Restoration" and Can't Restore Can't Drop it

Posted: 11 Mar 2013 03:23 AM PDT

Hi all, we had a connection problem for some hours yesterday. All databases re-attached automatically except one (DBNAME), which stayed in "restoring" mode. So I restored it from backup as DBNAME_RESTORE, and I really don't need the stuck database any more. What I did:

DROP DATABASE DBNAME
-- Result: successful

sp_renamedb 'DBNAME_RESTORE', 'DBNAME'
-- Result: Error, the database DBNAME already exists. Specify a unique database name.

When I have a look, I see that the database has not actually been dropped. I tried to restore it:

RESTORE DATABASE DBNAME WITH RECOVERY

Msg 823, Level 24, State 2, Line 1
The operating system returned error 21 (The device is not ready.) to SQL Server during a read at offset 0x00000000012000 in file 'C:\DBNAME.mdf'. You may find more details in the additional messages in the SQL Server error log and the system event log. This is a severe system-level error condition that threatens database integrity and must be corrected immediately. Complete a full database consistency check (DBCC CHECKDB). This error can be caused by many factors; for more information, see SQL Server Books Online.

I can't execute DBCC CHECKDB (DBNAME) because it is in the restoring state. Do you have an idea how to drop this database? Thanks in advance.

Isolation Level

Posted: 11 Mar 2013 01:59 AM PDT

Hello there, I have a db in standby mode that is restored from a primary one every 15 minutes. Users log on to the standby db to select information using the NOLOCK hint, but the db still takes locks, preventing the subsequent restore. What do you think I should do? Should I change the isolation level to snapshot? Will that help? Thank you in advance. Riri

Table variable subquery as column

Posted: 07 Mar 2013 04:15 AM PST

Hi All, I'm trying to use a value from a table variable's column that matches a passed-in variable as a subquery for one of the main query's columns. I'm trying the below, but the values for @Region and @Queue are showing up as blank in the SSRS report fields which pull from it:

[code="sql"]ALTER PROCEDURE [dbo].[udp_WorkQueuesReport]
 -- Add the parameters for the stored procedure here
 @FromDt as date,
 @ThruDt as date,
 @Region as varchar(max),
 @Queue as varchar(max)
AS
BEGIN
 -- SET NOCOUNT ON added to prevent extra result sets from
 -- interfering with SELECT statements.
 SET NOCOUNT ON;

 Declare @RegionTbl as table(RegionName varchar(50))
 Declare @QueueTbl as table(QueueName varchar(50))

 Insert @RegionTbl select Value from hpexprpt.dbo.split(@Region,',')
 Insert @QueueTbl select Value from hpexprpt.dbo.split(@Queue,',')

 select users.last_name + ',' + users.first_name as [User ID]
      , (Select RegionName from @RegionTbl where RegionName = @Region) as Region
      , (Select QueueName from @QueueTbl where QueueName = @Queue) as Queue
      ...
 from hpexpprod.dbo.work_items
 join ...
 where ...[/code]

Any pointers would be greatly appreciated. Thanks in advance!

How to update multiple tables in SQL Server 2008?

Posted: 07 Mar 2013 10:58 PM PST

Hi All, I want to update multiple columns in multiple tables in a single UPDATE query, something like the query below:

[code="sql"]UPDATE Table1, Table2
SET Table1.Column1 = 'one',
    Table2.Column2 = 'two'
FROM Table1 T1, Table2 T2
WHERE T1.id = T2.id
  AND T1.id = 'id1'[/code]

Does SQL Server 2008 provide any mechanism to do so? If it does, please share some links with me! Thanks!
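A T-SQL UPDATE statement can only target one table, so the multi-table form sketched above is not valid syntax. The usual pattern is one UPDATE per table, wrapped in a transaction so both succeed or fail together (table and column names follow the post):

```sql
BEGIN TRANSACTION;

UPDATE t1
SET t1.Column1 = 'one'
FROM Table1 AS t1
JOIN Table2 AS t2 ON t1.id = t2.id
WHERE t1.id = 'id1';

UPDATE t2
SET t2.Column2 = 'two'
FROM Table2 AS t2
JOIN Table1 AS t1 ON t1.id = t2.id
WHERE t1.id = 'id1';

COMMIT TRANSACTION;
```

In real code, error handling (TRY/CATCH with ROLLBACK) around the transaction is advisable.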

Not consuming memory

Posted: 08 Mar 2013 12:37 AM PST

My company is rolling out a new product that was supposed to go to beta on March 1. That has been deferred, so the pressure is to get everything done yesterday.It is a complex multi-tiered application running web services, Citrix published apps, multiple databases and instances on a virtualized cluster. Sort of a kitchen sink of input sources. I had ZERO input on the database design and system architecture. So I'm having to learn the system in the middle of the problem.Which probably sounds familiar to most people here, no?The load test was focused on the web services so I was not allowed to capture any SQL statistics. I was only able to watch the defaults available through the Activity Monitor.The strangest thing during the test from the database end is that memory utilization peaked at 1.5 GB on an instance that had 28 GB assigned to it. Today we tested the instance with a few memory hogging scripts just to show that they were configured properly and, as expected, the memory was easily consumed.The load test had some interesting things happen. As the web requests loaded up the front end, the CPU climbed linearly - a nice direct correlation to the number of request hitting the web servers. But as soon as the CPU hit 25% it leveled off even as we doubled, tripled and quadrupled the number of web hits.More interesting is that there were two SQL instances in the test and when the CPU leveled off the waits displayed in the Activity Monitor started climbing up into the thousands. Even more curious is that the waits on each instance were inversely correlated. When one would peak, the other would be at a minimum in a very regular saw toothed pattern.So I have to recommend a "solution" without having any data since I wasn't allowed to pull any during the test.FWIW, disk I/O was fine. 
Today's memory test showed that the memory allocation of the instance is fine. My first recommendation is going to be to put all the databases on the same instance (there were two databases, one on each instance, that talked to each other a great deal) just to see how it affects the waits and the crosstalk between those two databases. Then look at tempdb issues and insist that I be allowed to pull some performance counters DURING the test. I found the oscillation of peak waits very interesting. Has anyone ever seen this type of behavior? I'm not expecting any magic answers here, more just some possibilities so I can drill down into the lower levels.

Logging all executed SQL Server queries

Posted: 10 Mar 2013 10:25 PM PDT

Hello Friends, I am using SQL 2008 R2. Many users use this instance of SQL Server, and they all connect with the same sa user and password. I want to log each query run by a user with the following details: user name (maybe the [b]computer name or IP address[/b]), query text, and the date and time of the query. Is it possible? If yes, then how?
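When every application shares the sa login, the host name and client address are the usual ways to tell sessions apart. A hedged sketch of a point-in-time view using the standard DMVs (continuous logging would need a server-side trace or, in later versions, Extended Events):

```sql
-- One row per current user session: who is connected from where,
-- and the most recent statement on that connection.
SELECT s.session_id,
       s.host_name,                    -- client computer name
       c.client_net_address,           -- client IP address
       s.last_request_start_time,
       t.text AS query_text
FROM sys.dm_exec_sessions    AS s
JOIN sys.dm_exec_connections AS c
  ON c.session_id = s.session_id
CROSS APPLY sys.dm_exec_sql_text(c.most_recent_sql_handle) AS t
WHERE s.is_user_process = 1;
```

Note that host_name is supplied by the client and can be spoofed; giving each user or application its own login remains the more reliable (and auditable) fix.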

If I compress a primary key, am I compressing the whole table --> P-key with DATA_COMPRESSION

Posted: 05 Mar 2013 06:56 AM PST

Can someone explain how data compression works at the index level? If I compress a primary key, am I compressing the whole table [since the data pages are leaf pages]?
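To the point in the question: yes, for a clustered primary key the answer is effectively "the whole table", because the clustered index's leaf pages *are* the table's data pages; a nonclustered index is a separate structure compressed independently. A hedged sketch (object names are hypothetical):

```sql
-- Compressing the clustered PK rebuilds the table's own data pages.
ALTER INDEX PK_MyTable ON dbo.MyTable
REBUILD WITH (DATA_COMPRESSION = PAGE);

-- A nonclustered index must be compressed separately if desired.
ALTER INDEX IX_MyTable_SomeColumn ON dbo.MyTable
REBUILD WITH (DATA_COMPRESSION = PAGE);
```

If the primary key were nonclustered, compressing it would affect only that index, not the underlying heap or clustered index.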

SQL.log file growing rapidly C:\Users\sqlaccount\AppData\Local\Temp

Posted: 10 Mar 2013 10:26 PM PDT

Hi Guys, on the development server I noticed a file called SQL in [b]C:\Users\sqlaccount\AppData\Local\Temp[/b] that has grown to 10 GB. The only way to get rid of it is to stop SQL Agent and delete the file, but when I restart the agent the file is created again. It takes over a month to grow to 10 GB. I can confirm the SQL Agent error log is pointing to a different drive. Message in the SQL log file:

SQLAGENT 520-344 EXIT SQLGetDiagFieldW with return code 100 (SQL_NO_DATA_FOUND)
 SQLSMALLINT 3
 SQLHANDLE 0x00000000002D6740
 SQLSMALLINT 1
 SQLSMALLINT -1154
 SQLPOINTER 0x000000001CFC871C
 SQLSMALLINT -6
 SQLSMALLINT * 0x0000000000000000

Any ideas what this is?
